Computer simulations built with advanced mathematical techniques such as agent-based modelling and system dynamics are used by hedge funds to make vast amounts of money. Hedge funds operate in a complex, continually evolving, competitive global system.
Advanced simulations can do more than make vast amounts of money for hedge funds: they must also be used on a daily basis to design economic policies in the 21st century, so that we avert tomorrow's crisis.
Survival and success factors include strong controls and an outstanding reputation, which help attract and retain both investors and the brightest people available:
  • people who are continually fed better information by their contacts and by their data models;
  • who can execute their decisions rapidly and without error; and
  • who continually test their ideas and learn from their own and others’ experiences.
They don’t have to get it right all the time, but they do have to survive and outperform their competitors (which include less risky investment alternatives) in bad times as well as in good. Hedge funds are filled with really bright guys (‘Masters of the Universe’, to quote Tom Wolfe; for example, the ex-mathematician James Simons of Renaissance Technologies, a top performer for twenty years). These guys use their superior modelling and risk-management techniques, in a highly leveraged way, to make regular exceptional profits for their partners and themselves. Even so, only a few practitioners are believed to have begun using models that take account of irrational and collective behaviour; hence the recent, regular complaints about ‘once in a hundred year’ events and ‘25 sigma deviations’ (e.g. from a Goldman Sachs hedge fund). But see recent reports in Nature and New Scientist, and the work of the econophysics community.
While many of these guys have a strong math or physics background, the rest of the market (and its regulators and risk managers) is filled with classical traders (typically chartists, and people who sense what’s happening by talking to people and watching patterns on screens) and with people who have, since the eighties, been through business schools that taught Standard Finance Theory (SFT) and its companion, the Efficient Market Hypothesis. SFT comes from a world where computation was expensive, and short-cut assumptions (such as using only the first two moments of a distribution, mean and variance, which was also assumed to be Gaussian) justified the arguments for the random-walk approach to investing, the CAPM, the use of alpha and beta, the adoption of the brilliant Black-Scholes derivative-pricing formula based on stochastic calculus, and so on. These arguments have rattled on for years (and it is the conventional SFT approach that regulators and risk managers seem to have adopted), although there is plenty of evidence that chartists make their money by adopting the opposite viewpoint, one seemingly related to the market impact of traders themselves. It seems reasonable to suggest that, in future, leading hedge funds could gain additional benefit from a smarter approach to modelling: one that incorporates more of the micro level (the types of trader and institution, their states, their probable behaviours under various market conditions, and how these evolve) alongside the existing macro level of modelling.
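To make the contrast concrete, the Black-Scholes formula mentioned above fits in a few lines of code. The sketch below is a minimal, self-contained version of the standard closed-form price for a European call; the input values at the bottom are arbitrary illustrative assumptions, not market data.

```python
# Minimal sketch of the closed-form Black-Scholes price for a
# European call. Illustrative only: example inputs are assumptions.
from math import log, sqrt, exp, erf

def norm_cdf(x: float) -> float:
    """Standard normal CDF, computed via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def black_scholes_call(S, K, T, r, sigma):
    """Call price under the Black-Scholes assumptions: Gaussian
    log-returns, constant volatility, frictionless markets."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

# Example: spot 100, strike 100, 1 year to expiry, 5% rate, 20% vol.
price = black_scholes_call(100.0, 100.0, 1.0, 0.05, 0.20)
```

The Gaussian assumption is baked in through the normal CDF: the whole price depends only on the first two moments of the return distribution, which is precisely the short cut that critics of SFT object to.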
Following some spectacular disasters in the eighties (and complaints that Japanese banks were gaining unfair advantage through excessive balance-sheet growth for the capital employed), regulators came up with Basel I, and then Basel II, which imposed controls on banks’ risk-adjusted capital ratios but without proper investigation of how they were implemented (hence the drive by most banks to use off-balance-sheet vehicles and other tactics to get round the regulations and grow their earnings). Until recently, nobody paid much attention to what happens at a system level when everyone is corralled into adopting the same mechanism; another unintended consequence of standardised regulation was seen in the raging arguments over the implementation of the ‘mark to market’ regime. One of the biggest issues in the 2008 crisis was a largely non-systemic approach to modelling and measuring risk: treating an individual institution’s portfolio of instruments, its past correlations, risk parameters and so on as if the institution could, in a crisis, act in isolation, when regulation has of course created systemic correlations that come into play as soon as systemic risk begins to arise. A smart hedge fund might have anticipated these probable behaviours and taken advantage of them at an early stage.
This lack of insight into complex-system behaviour and its opportunities (and threats) may be put down to the prevalence of a quantitative mindset derived from the relatively predictable world of mathematics and physics, which may be fine while the complex financial system is in one of its relatively stable states. When that is no longer the case (Black Swan events?), rather more powerful mental models from biology (e.g. evolutionary behaviour) and from non-linear complex adaptive systems (e.g. climate, or enzyme kinetics in biochemistry) may be needed. Indeed, one biological concept, punctuated equilibrium, could fit the 2008 credit crunch very well; the post-crunch world is quite likely to have rather different dynamics.
This is not new. As far back as 2006, scientists had identified opportunities to use advanced modelling techniques to understand economies. From ‘The Cambrian age of economics’, The Economist print edition (Jul 20th 2006):
 Eric Beinhocker, of the McKinsey Global Institute, in his book “The Origin of Wealth: Evolution, Complexity, and the Radical Remaking of Economics” argues that economists should abandon blackboard deduction in favour of computer simulation. The economists he likes do not “solve” models of the economy—deducing the prices and quantities that will prevail in equilibrium—rather they grow them “in silico”, as he puts it.
. . .
An early example is the sugarscape simulation done in 1995 by Joshua Epstein and Robert Axtell, of the Brookings Institution. On a computer-generated landscape, studded with “sugar” mountains, they scattered a variety of simple, sugar-eating creatures, which compete for this precious commodity. Some creatures move faster than others, some see farther, and some burn sugar at a higher metabolic rate than their rivals.
. . .
Such simulations may be unpredictable, but they are nonetheless understandable, Mr Beinhocker insists. By toying with different parameters, such as metabolic rates or the height of the sugar mountains, analysts can learn how to “tune” their model to generate different results. This understanding may be more valuable than a forecast, he argues. But whatever such enlightenment is worth, it is not easy to communicate to others. The revelations contained in a deductive proof or theorem are easy to pass on: they leave a set of footprints for other people to follow, making it easy for a theory to persuade and convert. For simulations, by contrast, “the only way to see what happens is to run the model and evolve it—there is no shortcut.”
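For readers unfamiliar with agent-based models, the sugarscape idea described in the excerpt can be sketched in a few dozen lines. The version below is a deliberately simplified illustration with assumed rules and parameters (grid size, vision, metabolism, regrowth rate); it is not a reproduction of Epstein and Axtell’s full 1995 model, which also covers reproduction, wealth distributions and more.

```python
# Toy sugarscape sketch: heterogeneous agents forage on a sugar
# landscape; those whose reserves run out die. All parameters are
# illustrative assumptions, not values from the original model.
import random

GRID = 20
random.seed(42)  # deterministic run for illustration

def sugar_at(x, y):
    """Maximum sugar at a cell: a single 'sugar mountain' at the centre."""
    cx = cy = GRID // 2
    return max(0, 4 - (abs(x - cx) + abs(y - cy)) // 3)

landscape = [[sugar_at(x, y) for y in range(GRID)] for x in range(GRID)]

class Agent:
    def __init__(self):
        self.x = random.randrange(GRID)
        self.y = random.randrange(GRID)
        self.vision = random.randint(1, 4)      # how far the agent sees
        self.metabolism = random.randint(1, 3)  # sugar burned per step
        self.sugar = 5                          # initial reserves

    def step(self):
        # Look along the four axes up to `vision` cells away and move
        # to the visible cell holding the most sugar.
        best = (self.x, self.y)
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            for d in range(1, self.vision + 1):
                nx, ny = (self.x + dx * d) % GRID, (self.y + dy * d) % GRID
                if landscape[nx][ny] > landscape[best[0]][best[1]]:
                    best = (nx, ny)
        self.x, self.y = best
        self.sugar += landscape[self.x][self.y] - self.metabolism
        landscape[self.x][self.y] = 0  # the cell is harvested bare

agents = [Agent() for _ in range(50)]
for t in range(30):
    for a in agents:
        a.step()
    agents = [a for a in agents if a.sugar > 0]  # starvation
    # Sugar slowly grows back towards each cell's maximum.
    for x in range(GRID):
        for y in range(GRID):
            landscape[x][y] = min(sugar_at(x, y), landscape[x][y] + 1)

survivors = len(agents)
```

Even this toy version illustrates Beinhocker’s point: there is no closed-form answer to which agents survive; the only way to see what happens is to run the model, and small changes to metabolism or regrowth rates change the outcome.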
Models built using insights from complex adaptive systems, that is, using agent-based modelling, system dynamics and so on, are transforming fields from biochemistry to epidemiology to ecology to weather forecasting; and whilst their penetration into hedge funds is accelerating, uptake by policymakers remains slow. Yet it is policymakers who need these simulations most: used on a daily basis to design 21st-century economic policies, they could help avert tomorrow's crisis.
Steve Keen, who has spent "40 years fighting delusions in economics", outlines eloquently why "economics must undergo a long-overdue intellectual revolution" in his book, Debunking Economics. The Queen famously asked academic economists at the London School of Economics about the 2008 crisis, "Why did nobody notice it?" Their response was that they had "lost sight of the bigger picture" and that no one could have seen the crisis coming. As Professor Keen puts it:
Balderdash. Though the precise timing of the crisis was impossible to pick, a systemic crisis was both inevitable and, to astute observers [NB: hedge funds] in the mid 2000s, likely to occur in the very near future.
In a much more recent article, ‘Greece crisis: better models can show how to stabilise eurozone’ (New Scientist, 7 July 2015):
Doyne Farmer of the Santa Fe Institute in New Mexico and the University of Oxford says leaders and their economic advisers have no way to work out quantitatively which mix of solutions works best, so fall back on ideological preferences and whatever historical examples support them. Deadlock ensues.
Mainstream models, called Dynamic Stochastic General Equilibrium models, assume that left to their own devices, economic systems will reach “equilibrium” as a result of buyers and sellers independently acting to try to maximise their own benefit. They assume that periodic disturbances are due to outside influences, not produced spontaneously within the system.
“Empirical studies now show this cannot be true,” says Farmer. Such models also failed spectacularly to predict or explain the financial crisis of 2008. “The atomistic, optimising agents underlying existing models do not capture behaviour during a crisis period,” European Central Bank president Jean-Claude Trichet said in 2010.
What might work, some researchers said then, are agent-based models (ABMs), which use modern computers’ number-crunching power to simulate people and institutions who do not necessarily behave optimally, and who interact.
Paul Mason, economics editor at Channel 4 News and author of PostCapitalism, recently wrote in his article 'My wish for 2015: a machine to judge political claims against reality':
If I could rub an empty lager can, and get a genie to appear and grant me one wish for 2015, it would be for something apparently banal but revolutionary: an accurate simulation of the economy.
The lesson for today, therefore, is that survival and success in a complex evolving environment such as the global financial system require the continual development, deployment and effective use of simulation models that reflect more features of reality than those used by competitors. This may be no guarantee of survival and success, but it would at least signal to investors and others a greater chance of long-term survival. Some aspects of such models require both top-down (macro, using system dynamics) and bottom-up (micro, using agent-based and discrete-event) modelling. This is one of Providence’s key strengths and, when combined with powerful visualisation for better insights and powerful number-crunching for ever closer approaches to real-time sensing and response, may be the ultimate differentiator of the hedge fund of the future. Simudyne is making that world possible.