Stochastic (/stəˈkæstɪk/; from Ancient Greek στόχος (stókhos) 'aim, guess')[1] is the property of being well-described by a random probability distribution.[1] Stochasticity and randomness are technically distinct concepts: the former refers to a modeling approach, while the latter describes phenomena; in everyday conversation, however, these terms are often used interchangeably. In probability theory, the formal concept of a stochastic process is also referred to as a random process.[2][3][4][5][6]
Stochasticity is used in many different fields, including the natural sciences such as biology, chemistry,[8] ecology,[9] neuroscience,[10] and physics,[11][12][13][14] as well as technology and engineering fields such as image processing, signal processing, computer science, information theory, telecommunications,[7] and cryptography.[15][16] It is also used in finance (e.g., the stochastic oscillator), due to seemingly random changes in the different markets within the financial sector, and in medicine, linguistics, music, media, colour theory, botany, manufacturing, and geomorphology.[17][18][19]
The word stochastic in English was originally used as an adjective with the definition "pertaining to conjecturing", stemming from a Greek word meaning "to aim at a mark, guess"; the Oxford English Dictionary gives the year 1662 as its earliest occurrence.[1] In his work on probability, Ars Conjectandi, originally published in Latin in 1713, Jakob Bernoulli used the phrase "Ars Conjectandi sive Stochastice", which has been translated as "the art of conjecturing or stochastics".[20] This phrase was used, with reference to Bernoulli, by Ladislaus Bortkiewicz,[21] who in 1917 used the German word Stochastik in a sense meaning random. The term stochastic process first appeared in English in a 1934 paper by Joseph L. Doob.[1] For the term and a specific mathematical definition, Doob cited another 1934 paper, where the term stochastischer Prozeß was used in German by Aleksandr Khinchin,[22][23] though the German term had been used earlier, in 1931, by Andrey Kolmogorov.[24]
In the early 1930s, Aleksandr Khinchin gave the first mathematical definition of a stochastic process as a family of random variables indexed by the real line.[25][22][a] Further fundamental work on probability theory and stochastic processes was done by Khinchin as well as other mathematicians such as Andrey Kolmogorov, Joseph Doob, William Feller, Maurice Fréchet, Paul Lévy, Wolfgang Doeblin, and Harald Cramér.[27][28] Decades later Cramér referred to the 1930s as the "heroic period of mathematical probability theory".[28]
In mathematics, the theory of stochastic processes is an important contribution to probability theory,[29] and continues to be an active topic of research for both theory and applications.[30][31][32]
The word stochastic is used to describe other terms and objects in mathematics. Examples include a stochastic matrix, which describes a stochastic process known as a Markov process, and stochastic calculus, which involves differential equations and integrals based on stochastic processes such as the Wiener process, also called the Brownian motion process.
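To make the connection concrete, here is a minimal sketch of a Markov process driven by a stochastic matrix; the two-state "weather" model and its transition probabilities are invented for illustration, not taken from the article:

```python
import random

# Illustrative 2-state model. Row i holds P(next state = j | current state = i);
# each row sums to 1, which is what makes this a (right) stochastic matrix.
P = [[0.9, 0.1],   # sunny -> sunny, sunny -> rainy
     [0.5, 0.5]]   # rainy -> sunny, rainy -> rainy
states = ["sunny", "rainy"]

def step(i):
    """Sample the next state index given the current state index i."""
    return random.choices(range(len(P[i])), weights=P[i])[0]

i = 0  # start sunny
path = [states[i]]
for _ in range(10):
    i = step(i)
    path.append(states[i])
print(" -> ".join(path))
```

The defining Markov property is visible in `step`: the next state depends only on the current state, never on the earlier history of the path.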
One of the simplest continuous-time stochastic processes is Brownian motion. This was first observed by botanist Robert Brown while looking through a microscope at pollen grains in water.
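As a hedged illustration (the discretization scheme, step count, and seed are arbitrary choices, not from the source), Brownian motion can be approximated by summing independent Gaussian increments whose variance equals the time step:

```python
import random

def brownian_path(t_end=1.0, n_steps=1000, seed=42):
    """Approximate a standard Brownian motion on [0, t_end].

    Each increment is drawn from N(0, dt); the path is their running sum.
    """
    rng = random.Random(seed)
    dt = t_end / n_steps
    w = 0.0
    path = [w]
    for _ in range(n_steps):
        w += rng.gauss(0.0, dt ** 0.5)  # increment ~ N(0, dt)
        path.append(w)
    return path

path = brownian_path()
print(f"W(1) = {path[-1]:.4f}")  # one draw; E[W(1)] = 0 and Var[W(1)] = 1
```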
The Monte Carlo method is a stochastic method popularized by the physics researchers Stanisław Ulam, Enrico Fermi, John von Neumann, and Nicholas Metropolis.[33] The use of randomness and the repetitive nature of the process are analogous to the activities conducted at a casino. Earlier methods of simulation and statistical sampling generally did the opposite: they used simulation to test a previously understood deterministic problem. Though examples of the "inverted" approach do exist historically, they were not considered a general method until the popularity of the Monte Carlo method spread.
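The "inverted" approach, randomness applied to a deterministic question, is easiest to see in the textbook example of estimating π by uniform sampling; this sketch is purely illustrative and the sample count is arbitrary:

```python
import random

def estimate_pi(n_samples=100_000, seed=0):
    """Estimate pi as 4 * (fraction of random points inside the unit quarter-circle)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            hits += 1
    return 4.0 * hits / n_samples

print(estimate_pi())  # converges to pi at the slow Monte Carlo rate O(1/sqrt(n))
```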
Perhaps the most famous early use was by Enrico Fermi in the 1930s, when he used a random method to calculate the properties of the newly discovered neutron. Monte Carlo methods were central to the simulations required for the Manhattan Project, though they were severely limited by the computational tools of the time. It was therefore only after electronic computers were first built (from 1945 on) that Monte Carlo methods began to be studied in depth. In the 1950s they were used at Los Alamos for early work relating to the development of the hydrogen bomb, and they became popularized in the fields of physics, physical chemistry, and operations research. The RAND Corporation and the U.S. Air Force were two of the major organizations responsible for funding and disseminating information on Monte Carlo methods during this time, and the methods began to find wide application in many different fields.
Monte Carlo methods require large quantities of random numbers, and their use spurred the development of pseudorandom number generators, which were far quicker to use than the tables of random numbers previously used for statistical sampling.
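As an illustration of what such a generator is, the sketch below implements a linear congruential generator with the classic Lehmer ("minimal standard") constants; the choice of generator and constants is ours for familiarity, not the article's:

```python
def lcg(seed=1, a=16807, m=2**31 - 1):
    """Lehmer / Park-Miller linear congruential generator.

    Yields floats in (0, 1); the stream is entirely deterministic given
    the seed, which is what the 'pseudo' in pseudorandom means.
    """
    x = seed
    while True:
        x = (a * x) % m
        yield x / m

gen = lcg(seed=2024)
print([round(next(gen), 4) for _ in range(5)])
```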
In biological systems, the technique of stochastic resonance (introducing stochastic "noise") has been found to help improve the signal strength of the internal feedback loops for balance and other vestibular communication.[34] The technique has helped diabetic and stroke patients with balance control.[35]
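The underlying idea can be sketched in a few lines (the signal, threshold, and noise levels below are invented): a signal too weak to cross a detector's threshold on its own becomes detectable when moderate noise is added, while heavy noise drowns it out again.

```python
import math
import random

random.seed(1)
threshold = 1.0
# Subthreshold signal: peaks at 0.8, below the detection threshold of 1.0.
signal = [0.8 * math.sin(2 * math.pi * t / 50) for t in range(200)]

def detections(noise_sd):
    """Count threshold crossings of signal plus Gaussian noise."""
    return sum(1 for s in signal if s + random.gauss(0, noise_sd) > threshold)

for sd in (0.0, 0.3, 3.0):
    print(f"noise sd={sd}: {detections(sd)} detections")
# sd=0.0: zero detections (the signal alone never crosses);
# sd=0.3: crossings cluster near the signal's peaks, so it becomes detectable;
# sd=3.0: crossings occur everywhere and the signal is drowned out.
```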
Many biochemical events lend themselves to stochastic analysis. Gene expression, for example, has a stochastic component: the molecular collisions involved, as during the binding and unbinding of RNA polymerase to a gene promoter, arise from the Brownian motion of the solution.
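Such chemistry is commonly simulated with Gillespie's stochastic simulation algorithm, which the text does not name but which is the standard exact method; the birth-death model and rate constants below are invented for illustration:

```python
import random

def gillespie_mrna(k_make=2.0, k_decay=0.1, t_end=50.0, seed=7):
    """Gillespie SSA for: gene -> gene + mRNA (rate k_make); mRNA -> 0 (rate k_decay * n).

    Returns (time, copy number) samples; the rate constants are illustrative.
    """
    rng = random.Random(seed)
    t, n = 0.0, 0
    trace = [(t, n)]
    while t < t_end:
        rates = [k_make, k_decay * n]        # propensity of each reaction
        total = sum(rates)
        t += rng.expovariate(total)          # exponential waiting time to the next event
        if rng.random() * total < rates[0]:  # choose which reaction fired
            n += 1                           # transcription
        else:
            n -= 1                           # degradation
        trace.append((t, n))
    return trace

trace = gillespie_mrna()
print(f"final mRNA count: {trace[-1][1]} (steady-state mean = k_make/k_decay = 20)")
```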
Simonton (2003, Psychological Bulletin) argues that the creativity of scientists is a constrained stochastic behaviour, such that new theories in all sciences are, at least in part, the product of a stochastic process.[36]
Stochastic ray tracing is the application of Monte Carlo simulation to the computer graphics ray tracing algorithm. "Distributed ray tracing samples the integrand at many randomly chosen points and averages the results to obtain a better approximation. It is essentially an application of the Monte Carlo method to 3D computer graphics, and for this reason is also called Stochastic ray tracing."[citation needed]
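A hedged sketch of that idea follows (the "scene" is a trivial stand-in function rather than a real ray tracer): each pixel value is the average of the scene evaluated at many randomly jittered sample points, i.e., Monte Carlo integration over the pixel area.

```python
import random

def scene(x, y):
    """Stand-in for tracing one ray: 1.0 if the point is inside a disc, else 0.0."""
    return 1.0 if (x - 0.5) ** 2 + (y - 0.5) ** 2 < 0.1 else 0.0

def render_pixel(px, py, samples=64, seed=0):
    """Average the scene over randomly jittered sub-pixel positions."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(samples):
        # jitter: a uniformly random point inside the unit pixel square
        total += scene(px + rng.random(), py + rng.random())
    return total / samples

# The pixel covering the disc gets roughly its coverage fraction (pi * 0.1, about 0.31)
# instead of the hard 0-or-1 answer of a single centre sample.
print(render_pixel(0, 0))
```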
Stochastic forensics analyzes computer crime by viewing computer activity as a stochastic process.
In artificial intelligence, stochastic programs use probabilistic methods to solve problems, as in simulated annealing, stochastic neural networks, stochastic optimization, genetic algorithms, and genetic programming. A problem itself may also be stochastic, as in planning under uncertainty.
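As a sketch of one of the listed methods, simulated annealing (the toy objective, proposal distribution, and cooling schedule are all illustrative choices): moves that worsen the objective are still accepted with a probability that shrinks as the "temperature" cools, which lets the search escape local minima.

```python
import math
import random

def f(x):
    """Toy objective with several local minima."""
    return x * x + 10 * math.sin(x)

def simulated_annealing(x=5.0, t=10.0, cooling=0.99, steps=5000, seed=3):
    rng = random.Random(seed)
    best = x
    for _ in range(steps):
        cand = x + rng.gauss(0, 1)           # random proposal near the current point
        delta = f(cand) - f(x)
        # always accept improvements; accept worse moves with probability exp(-delta/t)
        if delta < 0 or rng.random() < math.exp(-delta / t):
            x = cand
        if f(x) < f(best):
            best = x
        t *= cooling                          # cool down
    return best

x = simulated_annealing()
print(f"x = {x:.3f}, f(x) = {f(x):.3f}")  # the global minimum is near x = -1.306
```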
Financial markets use stochastic models to represent the seemingly random behaviour of financial assets, including the price of one currency relative to another (such as the US dollar against the euro), and also to represent the random behaviour of interest rates. These models are then used by financial analysts to value options on stock prices, bond prices, and interest rates (see Markov models). Stochastic modelling is also at the heart of the insurance industry.
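A common concrete form of such a model, shown here as an illustrative sketch with invented drift and volatility parameters, is geometric Brownian motion, in which the asset's log-price receives independent Gaussian shocks at each step:

```python
import math
import random

def gbm_path(s0=100.0, mu=0.05, sigma=0.2, t_end=1.0, n_steps=252, seed=11):
    """Simulate geometric Brownian motion: dS = mu*S dt + sigma*S dW.

    Uses the exact log-space update, so the discretization is unbiased.
    """
    rng = random.Random(seed)
    dt = t_end / n_steps
    s = s0
    path = [s]
    for _ in range(n_steps):
        z = rng.gauss(0.0, 1.0)
        s *= math.exp((mu - 0.5 * sigma ** 2) * dt + sigma * math.sqrt(dt) * z)
        path.append(s)
    return path

print(f"price after one year: {gbm_path()[-1]:.2f}")
# Monte Carlo option valuation averages a payoff such as max(S_T - K, 0)
# over many such simulated paths.
```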
The formation of river meanders has been analyzed as a stochastic process.
Non-deterministic approaches in language studies are largely inspired by the work of Ferdinand de Saussure, for example, in functionalist linguistic theory, which argues that competence is based on performance.[37][38] In functional theories of grammar, this competence-performance relationship should be carefully distinguished from the langue and parole distinction. To the extent that linguistic knowledge is constituted by experience with language, grammar is argued to be probabilistic and variable rather than fixed and absolute. This conception of grammar as probabilistic and variable follows from the idea that one's competence changes in accordance with one's experience with language. Though this conception has been contested,[39] it has also provided the foundation for modern statistical natural language processing[40] and for theories of language learning and change.[41]
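As a minimal sketch of the probabilistic-grammar idea behind statistical natural language processing (the corpus is deliberately trivial and no smoothing is applied): a bigram model estimates each word's probability from how often it followed the previous word in past experience, so grammaticality becomes a matter of degree.

```python
from collections import Counter, defaultdict

# A toy corpus standing in for a speaker's experience with language.
corpus = "the dog barks . the dog runs . the cat runs .".split()

# Count how often each word follows each other word.
follows = defaultdict(Counter)
for prev, word in zip(corpus, corpus[1:]):
    follows[prev][word] += 1

def p(word, prev):
    """Conditional probability P(word | prev) by relative frequency."""
    total = sum(follows[prev].values())
    return follows[prev][word] / total if total else 0.0

print(p("dog", "the"))   # 2/3: 'the dog' is likelier than 'the cat' in this experience
print(p("cat", "the"))   # 1/3
```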
Manufacturing processes are assumed to be stochastic processes. This assumption is largely valid for both continuous and batch manufacturing processes. Testing and monitoring of the process are recorded using a process control chart, which plots a given process control parameter over time. Typically a dozen or more parameters are tracked simultaneously. Statistical models are used to define limit lines, which determine when corrective action must be taken to bring the process back to its intended operational window.
This same approach is used in the service industry where parameters are replaced by processes related to service level agreements.
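A minimal sketch of the limit-line computation (the readings are invented, and the ±3σ rule is the conventional Shewhart choice, used here for illustration): limits are set from in-control baseline data, and later readings falling outside them trigger corrective action.

```python
import statistics

# Baseline data from a period when the process was known to be in control.
baseline = [10.1, 9.8, 10.0, 10.2, 9.9, 10.1, 10.0, 9.7, 10.3, 10.0]
mean = statistics.mean(baseline)
sd = statistics.stdev(baseline)
ucl, lcl = mean + 3 * sd, mean - 3 * sd   # upper and lower control limit lines

# New readings plotted against the limit lines, as on a control chart.
for t, x in enumerate([10.0, 10.2, 9.9, 13.5, 10.1]):
    flag = "OUT OF CONTROL -> corrective action" if not (lcl <= x <= ucl) else "ok"
    print(f"t={t}: {x:5.1f}  {flag}")
```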
The marketing of, and the shifting audience tastes and preferences for, certain film and television debuts (i.e., their opening weekends, word of mouth, top-of-mind awareness among surveyed groups, star name recognition, and other elements of social media outreach and advertising) are determined in part by stochastic modeling. A recent attempt at repeat-business analysis was done by Japanese scholars[citation needed] and is part of the Cinematic Contagion Systems patented by Geneva Media Holdings; such modeling has been used in data collection from the time of the original Nielsen ratings to modern studio and television test audiences.
A stochastic effect, or "chance effect", is one classification of radiation effects that refers to the random, statistical nature of the damage. In contrast to a deterministic effect, the severity of a stochastic effect is independent of dose; only the probability of the effect increases with dose.
In music, mathematical processes based on probability can generate stochastic elements.
Stochastic processes may be used in music to compose a fixed piece or may be produced in performance. Stochastic music was pioneered by Iannis Xenakis, who coined the term stochastic music. Specific examples of mathematics, statistics, and physics applied to music composition are the use of the statistical mechanics of gases in Pithoprakta, statistical distribution of points on a plane in Diamorphoses, minimal constraints in Achorripsis, the normal distribution in ST/10 and Atrées, Markov chains in Analogiques, game theory in Duel and Stratégie, group theory in Nomos Alpha (for Siegfried Palm), set theory in Herma and Eonta,[42] and Brownian motion in N'Shima.[citation needed] Xenakis frequently used computers to produce his scores, such as the ST series including Morsima-Amorsima and Atrées, and founded CEMAMu. Earlier, John Cage and others had composed aleatoric or indeterminate music, which is created by chance processes but does not have a strict mathematical basis (Cage's Music of Changes, for example, uses a system of charts based on the I Ching). Lejaren Hiller and Leonard Isaacson used generative grammars and Markov chains in their 1957 Illiac Suite. Modern electronic music production techniques make these processes relatively simple to implement, and many hardware devices such as synthesizers and drum machines incorporate randomization features. Generative music techniques are therefore readily accessible to composers, performers, and producers.
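As an illustrative sketch of the Markov-chain technique (the note set and transition weights below are invented, not Xenakis's): each next pitch is drawn from a distribution conditioned only on the current pitch.

```python
import random

# Invented transition table: weights for the next note given the current note.
transitions = {
    "C": {"C": 1, "E": 3, "G": 2},
    "E": {"C": 2, "E": 1, "G": 3},
    "G": {"C": 3, "E": 2, "G": 1},
}

def compose(start="C", length=16, seed=5):
    """Generate a note sequence by walking the Markov chain."""
    rng = random.Random(seed)
    note, melody = start, [start]
    for _ in range(length - 1):
        nxt = transitions[note]
        note = rng.choices(list(nxt), weights=list(nxt.values()))[0]
        melody.append(note)
    return melody

print(" ".join(compose()))
```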
Stochastic social science theory is similar to systems theory in that events are interactions of systems, although with a marked emphasis on unconscious processes. The event creates its own conditions of possibility, rendering it unpredictable if only for the number of variables involved. Stochastic social science theory can be seen as an elaboration of a kind of 'third axis' on which to situate human behavior alongside the traditional 'nature vs. nurture' opposition. See Julia Kristeva on her usage of the 'semiotic', Luce Irigaray on reverse Heideggerian epistemology, and Pierre Bourdieu on polythetic space for examples of stochastic social science theory.[citation needed]
The term stochastic terrorism has come into frequent use[43] with regard to lone wolf terrorism. The terms "scripted violence" and "stochastic terrorism" are linked in a cause-and-effect relationship: "scripted violence" rhetoric can result in an act of "stochastic terrorism". The phrase "scripted violence" has been used in social science since at least 2002.[44]
Author David Neiwert, who wrote the book Alt-America, told Salon interviewer Chauncey DeVega:
Scripted violence is where a person who has a national platform describes the kind of violence that they want to be carried out. He identifies the targets and leaves it up to the listeners to carry out this violence. It is a form of terrorism. It is an act and a social phenomenon where there is an agreement to inflict massive violence on a whole segment of society. Again, this violence is led by people in high-profile positions in the media and the government. They're the ones who do the scripting, and it is ordinary people who carry it out.
Think of it like Charles Manson and his followers. Manson wrote the script; he didn't commit any of those murders. He just had his followers carry them out.[45]
When color reproductions are made, the image is separated into its component colors by taking multiple photographs filtered for each color. One resultant film or plate represents each of the cyan, magenta, yellow, and black data. Color printing is a binary system, where ink is either present or not present, so all color separations to be printed must be translated into dots at some stage of the workflow. Traditional amplitude-modulated line screens had problems with moiré but were used until stochastic screening became available. A stochastic (or frequency-modulated) dot pattern creates a sharper image.
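The idea can be sketched as follows (the "image" is a flat gray patch, and real stochastic screens use carefully designed blue-noise masks rather than the raw random thresholds shown here): each position independently receives ink with probability equal to the gray level, so dots land at irregular positions instead of on a fixed grid.

```python
import random

def stochastic_screen(gray=0.3, width=16, height=8, seed=9):
    """Binarize a flat gray level: place ink iff a uniform random threshold falls below it.

    On average a fraction `gray` of positions get ink, with no regular grid,
    hence no moire pattern against the other color separations.
    """
    rng = random.Random(seed)
    rows = []
    for _ in range(height):
        rows.append("".join("#" if rng.random() < gray else "." for _ in range(width)))
    return "\n".join(rows)

print(stochastic_screen())  # roughly 30% ink coverage, irregularly placed
```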