Next week Mike and I will be in Italy for Sunbelt 2010 in Riva del Garda. Sunbelt features scholars from across the social and physical sciences — Mathematical Sociology, Political Science, Economics, Organization Studies as well as Physics and Applied Math. For those of you attending, we look forward to seeing you in Italy!
“When Tom Cruise put on his data glove and started whooshing through video clips of future crimes, how many of us felt the stirrings of geek lust? This iconic scene in Minority Report marked a change in popular thinking about interfaces — showing how sexy it could be to use natural gestures, without keyboard, mouse or command line. John Underkoffler led the team that came up with this interface, called the g-speak Spatial Operating Environment. His company, Oblong Industries, was founded to move g-speak into the real world. Oblong is building apps for aerospace, bioinformatics, video editing and more. But the big vision is ubiquity: g-speak on every laptop, every desktop, every microwave oven, TV, dashboard. ‘It has to be like this,’ he says. ‘We all of us every day feel that. We build starting there. We want to change it all.’ Before founding Oblong, Underkoffler spent 15 years at MIT’s Media Laboratory, working in holography, animation and visualization techniques, and building the I/O Bulb and Luminous Room Systems.”
As highlighted on Marginal Revolution, economist Herb Gintis has authored an Amazon.com review of the book “Meltdown: A Free-Market Look at Why the Stock Market Collapsed, the Economy Tanked, and Government Bailouts Will Make Things Worse” by Thomas E. Woods Jr. Suffice it to say, the review is not flattering. Those interested in the direct attack on the book can read the full review here. Our particular interest in his review lies in the last third of the text, where Professor Gintis highlights the genuine weaknesses of current macroeconomic theory. Below is the relevant text:
“I am often asked why macroeconomic theory is in such an awful state. The answer is simple. The basic model of the market economy was laid out by Leon Walras in the 1870’s, and its equilibrium properties were well established by the mid-1960’s. However, no one has succeeded in establishing its dynamical properties out of equilibrium. But macroeconomic theory is about dynamics, not equilibrium, and hence macroeconomics has managed to subsist only by ignoring general equilibrium in favor of toy models with a few actors and a couple of goods. Macroeconomics exists today because we desperately need macro models for policy purposes, so we invent toy models with zero predictive value that allow us to tell reasonable policy stories, the cogency of which are based on historical experience, not theory.
I think it likely that macroeconomics will not become scientifically presentable until we realize that a market economy is a complex dynamic nonlinear system, and we start to use the techniques of complexity analysis to model it. I present my arguments in Herbert Gintis, “The Dynamics of General Equilibrium,” Economic Journal 117 (2007): 1289-1309.”
While we do not necessarily agree with every point made in his review, the general thrust of the above argument is directly in line with the thinking of many here at the Center for the Study of Complex Systems. Indeed, the rebuke offered above could be extended and applied to other work in Economics and Political Science. A significant part of the problem is that the analytical apparatus in question is simply not up to the complexity of the relevant problems. Most of the current approaches derive from an era when a CPU had a transistor count of less than 10k and memory was exceedingly expensive. It is not as though leading scholars of the day were completely unaware that most systems are far more intricate than “a few actors and a couple of goods.” However, tractability concerns created a strong incentive to develop models which could be solved analytically.
Moderately high-end machines now have transistor counts of greater than 2,000,000,000 and memory is incredibly cheap (see generally Moore’s Law). There is no need to impose fixed-point equilibrium assumptions when there is no qualitative justification for ruling out the possibility that limit cycles, strange attractors, or some other class of dynamics are, in fact, the genuine dynamics of the system. We have previously highlighted the press release “What Computer Science Can Teach Economics” (and other social sciences). This is important work, but it is only the beginning.
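The point about dynamics is easy to see computationally. As a minimal sketch (a standard toy model of our own choosing, not drawn from Gintis’s paper), the one-parameter logistic map shows how the very same simple nonlinear system can settle on a fixed point, a limit cycle, or chaotic motion — so assuming the fixed-point case a priori discards most of the possible behavior:

```python
def iterate_logistic(r, x0=0.2, burn_in=500, keep=8):
    """Iterate the logistic map x_{t+1} = r * x_t * (1 - x_t),
    discard a transient, and return the last `keep` states (rounded)."""
    x = x0
    for _ in range(burn_in):
        x = r * x * (1 - x)
    tail = []
    for _ in range(keep):
        x = r * x * (1 - x)
        tail.append(round(x, 4))
    return tail

# Same equation, three qualitatively different long-run dynamics:
print("r=2.8:", iterate_logistic(2.8))  # converges to a fixed point
print("r=3.2:", iterate_logistic(3.2))  # settles on a period-2 limit cycle
print("r=3.9:", iterate_logistic(3.9))  # aperiodic (chaotic) trajectory
```

Only the first regime is visible to a method that solves for fixed points; the other two must be found by actually simulating the dynamics.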
More realistic representations of these complex systems are possible, but this requires scholars to consider jettisoning purely analytical approaches and solutions. When modeling complex adaptive systems, far more granularity is possible, but this requires a direct consideration of questions of computation and computational complexity. The use of a computational heuristic is not especially problematic, and it can help sidestep truly hard problems (e.g., NP-complete problems and the like). The difficult question is how and under what conditions one should select among the available set of such heuristics.
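To make the heuristic point concrete, here is a minimal sketch (the function and problem instance are our own illustration, not from the post) of the classic greedy heuristic for set cover, an NP-hard problem. Finding the optimal cover is intractable in general, but the greedy rule — always pick the set that covers the most still-uncovered elements — runs in polynomial time and is provably within a logarithmic factor of optimal:

```python
def greedy_set_cover(universe, subsets):
    """Return indices of subsets, chosen greedily, that cover `universe`."""
    uncovered = set(universe)
    chosen = []
    while uncovered:
        # pick the subset covering the most still-uncovered elements
        best = max(range(len(subsets)), key=lambda i: len(uncovered & subsets[i]))
        if not uncovered & subsets[best]:
            raise ValueError("universe cannot be covered by these subsets")
        chosen.append(best)
        uncovered -= subsets[best]
    return chosen

universe = {1, 2, 3, 4, 5}
subsets = [{1, 2, 3}, {2, 4}, {3, 4}, {4, 5}]
print(greedy_set_cover(universe, subsets))  # [0, 3]
```

This is exactly the trade the text describes: give up the guarantee of an exact answer in exchange for an answer at all — and the open question is when a given heuristic is the right one to trust.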
It is important to note that the dominant paradigm was itself a heuristic representation of agent behavior (and a useful one). While there are still some true believers, a declining number of serious scholars assert that individuals are actually perfect rational maximizers. At best, this assumption is a useful guidepost for agent behavior, and one which can be subjected to revision by continued work in behavioral economics and neuroeconomics.
For those looking for a genuine intellectual arbitrage opportunity … the path is clear … devote your time to filling this space, as it is one with significant potential returns. The way forward is to remix traditional approaches with leading findings in neuroscience, psychology, institutional analysis and, most importantly, computer science … winner gets a call from Sweden in about t+25.
This summer in the Complex Systems Advanced Academic Workshop we are devoting attention to information theory. In collecting some materials about Claude Shannon, I came across the above video and thought I would share it with others. Here is the description … “Considered the founding father of the electronic communication age, Claude Shannon’s work ushered in the Digital Revolution. This fascinating program explores his life and the major influence his work had on today’s digital world through interviews with his friends and colleagues.”
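For those new to the topic, the central quantity of Shannon’s theory can be computed in a few lines. A minimal sketch (our own illustration, not from the video): the empirical Shannon entropy H = -Σ p(x) log₂ p(x) of a string of symbols, measured in bits per symbol:

```python
import math
from collections import Counter

def entropy_bits(message):
    """Empirical Shannon entropy of the symbols in `message`, in bits/symbol."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A perfectly predictable source carries no information;
# a uniform source over k symbols carries log2(k) bits per symbol.
print(entropy_bits("abab"))  # 1.0
print(entropy_bits("abcd"))  # 2.0
```

This is the sense in which Shannon quantified information: entropy measures how unpredictable a source is, and hence how many bits are needed, on average, to encode each symbol it emits.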
The Financial Times’s Alphaville blog recently covered a number of quantitative models for predicting World Cup outcomes – models developed by well-known “quant” desks. Though this may seem like a waste of brains and shareholder value, World Cup outcomes are historically predictive of regional equity performance; furthermore, recent trends in securitization have not passed over sports as large as soccer. Here are the respective desks’ picks:
- JPM: England 1st, Spain 2nd, Netherlands 3rd (notes)
- UBS: Brazil 1st, Germany 2nd, Italy 3rd (notes, p. 37)
- GS: England, Argentina, Brazil, Spain (unranked) (notes, p. 71)
- Danske Bank: Brazil 1st, Germany 2nd (notes)
As could be expected, there is some disagreement as to the value of these predictions. Gary Jenkins of Evolution Securities chimes in with his own thoughts:
Yes it’s that time again when analysts like me who can barely predict what is going to happen in the market the following day turn away from our area of so called expertise and instead focus our attention on who is going to win the World Cup. I first got involved in this attempt to get some publicity 8 years ago, when Goldman Sachs produced a report combining economics and the World Cup and included their predictions as to who would get to the last four (I believe they got them all wrong) and had Sir Alex Ferguson pick his all time best World Cup team. I decided to do the same thing but had to explain that we could not afford Sir Alex. Thus I got my dad to pick his all time team. It caused more client complaints than most of my research and my favourites to win the tournament got knocked out early, so I abandoned this kind of research for a while.
Again, for more interesting coverage of the real-world effects of the World Cup, see FT Alphaville’s South Africa 2010 series. P.S. Go Azzurri this afternoon!
This is a bit far afield for the typical things we highlight on this blog. However, we thought this was an interesting story. Both Tom Schaller (538.com) and John Sides (The Monkey Cage) offer good initial analysis of the outcome. The other three references are simply offered for those seeking background information on the controversy.
Something Fishy in the South Carolina Primary (Tom Schaller @ 538.com)
Did Alvin Greene Win Because of Ballot Order? Because of Race? (John Sides @ The Monkey Cage)
If this does not load please click here to access the TED page.
Along with several other colleagues from Michigan CSCS, Mike, Jon and I are working at the 2010 OMnI / Shodor Workshop. The workshop is being run out of the Computer Science Department at Oberlin College and is designed to introduce various members of the Oberlin faculty to mathematical modeling and computational thinking. We just completed the first day and are looking forward to a great week!
Click on the above picture and you will be taken to the Interactive Gallery of Computational Legal Studies. Once inside the gallery, click on any thumbnail to see the full-size image. Each image features a link to supporting materials such as documentation and/or the underlying academic paper. We hope to add more content to the gallery over the coming weeks and months — so please check back! Please note that load time may vary depending upon your connection, machine, etc.