Econophysics is a heterodox (within economics) interdisciplinary research field that applies theories and methods originally developed by physicists to problems in economics, usually those involving uncertainty, stochastic processes, and nonlinear dynamics. Some of its applications to the study of financial markets have also been termed statistical finance, referring to their roots in statistical physics. Econophysics is closely related to social physics.
Physicists' interest in the social sciences is not new (see, e.g.,[1]); Daniel Bernoulli, for example, originated utility-based preferences. One of the founders of neoclassical economic theory, former Yale University Professor of Economics Irving Fisher, was originally trained under the renowned Yale physicist Josiah Willard Gibbs.[2] Likewise, Jan Tinbergen, who won the first Nobel Memorial Prize in Economic Sciences in 1969 for developing and applying dynamic models to the analysis of economic processes, studied physics under Paul Ehrenfest at Leiden University. In particular, Tinbergen developed the gravity model of international trade, which has become the workhorse of international economics.[citation needed]
Econophysics was started in the mid-1990s by several physicists working in the subfield of statistical mechanics. Dissatisfied with the traditional explanations and approaches of economists – which usually prioritized simplified, analytically soluble theoretical models over agreement with empirical data – they applied tools and methods from physics, first to try to match financial data sets, and then to explain more general economic phenomena.[citation needed]
One driving force behind econophysics arising at this time was the sudden availability of large amounts of financial data, starting in the 1980s. It became apparent that traditional methods of analysis were insufficient – standard economic methods dealt with homogeneous agents and equilibrium, while many of the more interesting phenomena in financial markets fundamentally depended on heterogeneous agents and far-from-equilibrium situations.[citation needed]
The term "econophysics" was coined by H. Eugene Stanley, to describe the large number of papers written by physicists in the problems of (stock and other) markets, in a conference on statistical physics in Kolkata (erstwhile Calcutta) in 1995 and first appeared in its proceedings publication in Physica A 1996.[3][4] The inaugural meeting on econophysics was organised in 1998 in Budapest by János Kertész and Imre Kondor. The first book on econophysics was by R. N. Mantegna & H. E. Stanley in 2000.[5]
Recurring meeting series on the topic include Econophys-Kolkata (held in Kolkata and Delhi),[6] the Econophysics Colloquium, and ESHIA/WEHIA.
In recent years network science, which relies heavily on analogies from statistical mechanics, has been applied to the study of productive systems. Examples include work done at the Santa Fe Institute, in European-funded research projects such as Forecasting Financial Crises, and at the Harvard–MIT Observatory of Economic Complexity.
The basic tools of econophysics are probabilistic and statistical methods, often taken from statistical physics.
Physics models that have been applied in economics include the kinetic theory of gases (the kinetic exchange models of markets[7]), percolation models, chaotic models originally developed to study cardiac arrest, and models of self-organized criticality, as well as models developed for earthquake prediction.[8] Moreover, there have been attempts to use the mathematical theory of complexity and information theory, as developed by Murray Gell-Mann and Claude E. Shannon, respectively.
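As an illustration of the first of these, the following is a minimal sketch of a kinetic exchange model in the spirit of random pairwise wealth exchange: agents repeatedly pool and randomly split their wealth, and the wealth distribution relaxes toward a Boltzmann–Gibbs exponential. Parameter values are illustrative, not taken from the cited literature.

```python
# Kinetic exchange model sketch: random pairwise redistribution of wealth.
import numpy as np

rng = np.random.default_rng(0)

n_agents = 10_000
wealth = np.full(n_agents, 100.0)   # everyone starts with the same wealth

for _ in range(500_000):
    i, j = rng.integers(0, n_agents, size=2)
    if i == j:
        continue
    total = wealth[i] + wealth[j]
    eps = rng.random()              # random split of the pooled wealth
    wealth[i] = eps * total
    wealth[j] = (1.0 - eps) * total

# The stationary distribution is approximately exponential:
# P(w) ~ exp(-w / <w>), with <w> the (conserved) mean wealth.
print(f"mean wealth: {wealth.mean():.1f}")
print(f"fraction below the mean: {(wealth < wealth.mean()).mean():.3f}")
# For an exponential distribution this fraction is 1 - 1/e, about 0.632.
```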
For potential games, it has been shown that an emergence-producing equilibrium based on information via Shannon information entropy produces the same equilibrium measure (the Gibbs measure from statistical mechanics) as a stochastic dynamical equation representing noisy decisions, both of which are based on bounded-rationality models used by economists.[9] The fluctuation-dissipation theorem connects the two, establishing a concrete correspondence between "temperature", "entropy", "free potential/energy", and other physics notions and an economic system. The statistical-mechanics model is not constructed a priori; it results from a boundedly rational assumption and modeling on existing neoclassical models. It has been used to prove the "inevitability of collusion" result of Huw Dixon[10] in a case for which the neoclassical version of the model does not predict collusion.[11] Such cases involve increasing demand, as with Veblen goods, stock buyers subject to the "hot hand" fallacy who prefer to buy more successful stocks and sell less successful ones,[12] or short traders during a short squeeze, as occurred when the WallStreetBets group colluded to drive up the GameStop stock price in 2021.[13] Nobel laureate and founder of experimental economics Vernon L. Smith has used econophysics to model sociability via implementation of ideas in Humanomics. There, noisy decision-making and interaction parameters that facilitate the social-action responses of reward and punishment result in spin-glass models identical to those in physics.[14]
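A minimal sketch of the choice rule underlying this correspondence, assuming illustrative payoffs: under bounded rationality, maximizing expected payoff subject to a Shannon-entropy penalty yields the logit (softmax) rule, which is exactly a Gibbs measure, with the noise level playing the role of temperature.

```python
# Gibbs-measure (logit/softmax) choice rule: P(a) is proportional to
# exp(payoff(a) / T), where T is the decision "temperature".
import numpy as np

def gibbs_choice(payoffs: np.ndarray, temperature: float) -> np.ndarray:
    """Gibbs measure over actions; temperature = decision noise."""
    z = payoffs / temperature
    z -= z.max()                      # subtract max for numerical stability
    p = np.exp(z)
    return p / p.sum()

payoffs = np.array([1.0, 0.8, 0.1])   # hypothetical payoffs of three actions

for t in (0.05, 0.5, 5.0):
    print(t, gibbs_choice(payoffs, t).round(3))
# Low T: nearly deterministic best response; high T: nearly uniform
# (maximum entropy), mirroring the temperature notion from statistical mechanics.
```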
Quantifiers derived from information theory were used in several papers by econophysicist Aurelio F. Bariviera and coauthors to assess the degree of informational efficiency of stock markets.[15] Zunino et al. use an innovative statistical tool in the financial literature: the complexity-entropy causality plane. This Cartesian representation establishes an efficiency ranking of different markets and distinguishes different bond-market dynamics. They found that stock markets in more developed countries have higher entropy and lower complexity, while markets in emerging countries have lower entropy and higher complexity. Moreover, the authors conclude that the classification derived from the complexity-entropy causality plane is consistent with the ratings assigned by major rating companies to sovereign instruments. A similar study by Bariviera et al.[16] explores the relationship between credit ratings and the informational efficiency of a sample of corporate bonds of US oil and energy companies, also using the complexity-entropy causality plane, and finds that this classification agrees with the credit ratings assigned by Moody's.
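A minimal sketch of how a series can be located on the complexity-entropy causality plane, using Bandt–Pompe ordinal patterns; the permutation-entropy and statistical-complexity formulas follow the standard definitions in that literature, and the input here is a synthetic series rather than market data.

```python
# Complexity-entropy causality plane sketch: map a time series to
# (normalized permutation entropy H, statistical complexity C).
from itertools import permutations
from math import log
import numpy as np

def ordinal_distribution(x: np.ndarray, d: int) -> np.ndarray:
    """Probabilities of the d! ordinal (Bandt-Pompe) patterns in x."""
    patterns = {p: k for k, p in enumerate(permutations(range(d)))}
    counts = np.zeros(len(patterns))
    for t in range(len(x) - d + 1):
        counts[patterns[tuple(np.argsort(x[t:t + d]))]] += 1
    return counts / counts.sum()

def entropy(p: np.ndarray) -> float:
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

def complexity_entropy(x: np.ndarray, d: int = 4) -> tuple[float, float]:
    p = ordinal_distribution(x, d)
    n = len(p)
    h = entropy(p) / log(n)                      # normalized entropy in [0, 1]
    uniform = np.full(n, 1.0 / n)
    # Jensen-Shannon divergence between p and the uniform distribution.
    js = entropy((p + uniform) / 2) - entropy(p) / 2 - entropy(uniform) / 2
    # Normalization constant so that the divergence term lies in [0, 1].
    q0 = -2.0 / (((n + 1) / n) * log(n + 1) - 2 * log(2 * n) + log(n))
    return h, q0 * js * h                        # (H, C)

rng = np.random.default_rng(1)
noise = rng.standard_normal(5000)                # fully random series
print("white noise:", complexity_entropy(noise))  # H near 1, C near 0
```

On this plane, a perfectly efficient (random) market sits near maximal entropy and minimal complexity, which is how the efficiency ranking in the cited papers is read off.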
Another good example is random matrix theory, which can be used to identify the noise in financial correlation matrices. One paper has argued that this technique can improve portfolio performance, e.g., when applied in portfolio optimization.[17]
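A minimal sketch of such a filter, assuming synthetic return data: eigenvalues of the empirical correlation matrix that fall below the Marchenko–Pastur upper edge are treated as noise and flattened before the matrix is rebuilt. The exact cleaning recipe varies across the literature; this is one common variant.

```python
# Random-matrix denoising of a correlation matrix via the Marchenko-Pastur bound.
import numpy as np

rng = np.random.default_rng(2)
n_assets, n_obs = 50, 500
returns = rng.standard_normal((n_obs, n_assets))   # stand-in for real returns

corr = np.corrcoef(returns, rowvar=False)
eigval, eigvec = np.linalg.eigh(corr)

q = n_assets / n_obs
lambda_max = (1 + np.sqrt(q)) ** 2                 # Marchenko-Pastur upper edge

noise = eigval <= lambda_max
eigval_clean = eigval.copy()
eigval_clean[noise] = eigval[noise].mean()         # flatten the noise band,
                                                   # preserving the trace
corr_clean = eigvec @ np.diag(eigval_clean) @ eigvec.T
np.fill_diagonal(corr_clean, 1.0)                  # restore unit diagonal

print(f"eigenvalues flagged as noise: {noise.sum()} of {n_assets}")
```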
The ideology of econophysics is embodied in a new probabilistic economic theory and, on its basis, a unified theory of stock markets.[18][19]
There are also analogies between finance theory and diffusion theory. For instance, the Black–Scholes equation for option pricing is a diffusion-advection equation (see, however,[20][21] for a critique of the Black–Scholes methodology). The Black–Scholes theory can be extended to provide an analytical theory of the main factors in economic activities.[22]
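In standard notation, for an option value $V(S,t)$ on an underlying price $S$, with volatility $\sigma$ and risk-free rate $r$, the equation reads

$$
\frac{\partial V}{\partial t} + \frac{1}{2}\sigma^2 S^2 \frac{\partial^2 V}{\partial S^2} + r S \frac{\partial V}{\partial S} - r V = 0,
$$

where the second-derivative term in $S$ plays the role of diffusion and the first-derivative term plays the role of advection.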
Various other tools from physics have so far been used as well, such as fluid dynamics, classical mechanics, and quantum mechanics (including so-called classical economy, quantum economics, and quantum finance),[18] and the Feynman–Kac formula of statistical mechanics.[22]: 44 [23]
When mathematician Mark Kac attended a lecture by Richard Feynman, he realized their work overlapped.[24] Together they worked out a new approach to solving stochastic differential equations.[25] Their approach is used to efficiently calculate solutions to the Black–Scholes equation for pricing options on stocks.[26]
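A minimal sketch of this connection in practice, with illustrative parameters: by the Feynman–Kac representation, the Black–Scholes price of a European call equals the discounted risk-neutral expectation of its payoff, which can be estimated by Monte Carlo over geometric Brownian motion.

```python
# Feynman-Kac option pricing sketch: price = E[exp(-rT) * max(S_T - K, 0)]
# under the risk-neutral measure, estimated by Monte Carlo.
import numpy as np

rng = np.random.default_rng(3)

s0, k = 100.0, 105.0            # spot and strike (illustrative)
r, sigma, t = 0.05, 0.2, 1.0    # rate, volatility, maturity (illustrative)
n_paths = 1_000_000

# Terminal prices under the risk-neutral measure (exact GBM sampling).
z = rng.standard_normal(n_paths)
s_t = s0 * np.exp((r - 0.5 * sigma**2) * t + sigma * np.sqrt(t) * z)

payoff = np.maximum(s_t - k, 0.0)
price = np.exp(-r * t) * payoff.mean()
stderr = np.exp(-r * t) * payoff.std(ddof=1) / np.sqrt(n_paths)

print(f"European call via Feynman-Kac Monte Carlo: {price:.3f} +/- {stderr:.3f}")
# The closed-form Black-Scholes value for these inputs is about 8.02.
```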
Quantum statistical models have been successfully applied to finance by several groups of econophysicists using different approaches, but the origin of their success may not be due to quantum analogies.[27]: 668 [28]: 969
The editorial in the inaugural issue of the journal Quantum Economics and Finance says: "Quantum economics and finance is the application of probability based on projective geometry—also known as quantum probability—to modelling in economics and finance. It draws on related areas such as quantum cognition, quantum game theory, quantum computing, and quantum physics."[29] In his overview article in the same issue, David Orrell outlines how neoclassical economics benefited from the concepts of classical mechanics, and yet concepts of quantum mechanics "apparently left economics untouched".[30] He reviews different avenues for quantum economics, some of which he notes are contradictory, settling on "quantum economics therefore needs to take a different kind of leaf from the book of quantum physics, by adopting quantum methods, not because they appear natural or elegant or come pre-approved by some higher authority or bear resemblance to something else, but because they capture in a useful way the most basic properties of what is being studied."
Econophysics has had some impact on the more applied field of quantitative finance, whose scope and aims differ significantly from those of economic theory. Various econophysicists have introduced models for price fluctuations in the physics of financial markets, or original points of view on established models.[20][31][32]
Presently, one of the main results of econophysics is the explanation of the "fat tails" in the distribution of many kinds of financial data as a universal self-similar scaling property (i.e., scale-invariant over many orders of magnitude in the data),[33] arising from the tendency of individual market competitors, or of aggregates of them, to exploit systematically and optimally the prevailing "microtrends" (e.g., rising or falling prices). These "fat tails" are mathematically important because they describe risks that may, on the one hand, be so small that one is tempted to neglect them, but which, on the other hand, are not negligible at all: they can never be made exponentially tiny, but instead follow a measurable, algebraically decreasing power law, for example with a failure probability of only $P \propto x^{-4}$, where $x$ is an increasingly large variable in the tail region of the distribution considered (i.e., a price statistics comprising far more than $10^8$ data points). That is, the events considered are not simply "outliers" but must really be taken into account and cannot be "insured away".[34] It also appears to play a role that near a change of tendency (e.g., from falling to rising prices) there are typical "panic reactions" of the selling or buying agents, with algebraically increasing bargain rapidities and volumes.[34]
As in quantum field theory, the "fat tails" can be obtained by complicated "nonperturbative" methods, mainly numerical ones, since they contain the deviations from the usual Gaussian approximations, e.g., the Black–Scholes theory. Fat tails can, however, also be due to other phenomena, such as a random number of terms in the central-limit theorem, or any number of other non-econophysics models. Due to the difficulty in testing such models, they have received less attention in traditional economic analysis.
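A minimal sketch of how such a tail exponent can be measured, using the standard Hill estimator on synthetic Pareto-distributed data with a known exponent (empirical studies apply this to, e.g., absolute returns):

```python
# Hill estimator sketch for a power-law tail P(X > x) ~ x^(-alpha).
import numpy as np

def hill_estimator(x: np.ndarray, k: int) -> float:
    """Hill estimate of alpha from the k largest observations."""
    order = np.sort(x)
    threshold = order[-(k + 1)]              # (k+1)-th largest value
    return k / np.log(order[-k:] / threshold).sum()

rng = np.random.default_rng(4)
alpha_true = 3.0
# Inverse-transform sampling: U^(-1/alpha) is Pareto with tail exponent alpha.
data = (1.0 / rng.random(100_000)) ** (1.0 / alpha_true)

for k in (100, 1000, 5000):
    print(f"k={k:5d}  alpha_hat={hill_estimator(data, k):.2f}")
# Estimates cluster near the true alpha = 3. A Gaussian sample, by contrast,
# yields no stable exponent: the estimate drifts upward as k shrinks.
```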
In 2006, economists Mauro Gallegati, Steve Keen, Thomas Lux, and Paul Ormerod published a critique of econophysics.[35][36] They cite important empirical contributions, primarily in the areas of finance and industrial economics, but list four concerns with work in the field: lack of awareness of work in economics, resistance to rigor, a misplaced belief in universal empirical regularities, and inappropriate models.