Concurrent computing
For the American computer company, see Concurrent Computer Corporation. For a more theoretical discussion, see Concurrency (computer science).
Concurrent computing is a form of computing in which several computations are executed concurrently (during overlapping time periods) instead of sequentially (with one completing before the next starts).
This is a property of a system (whether a program, a computer, or a network) in which there is a separate execution point, or "thread of control", for each process. A concurrent system is one in which a computation can advance without waiting for all other computations to complete.[1]
Concurrent computing is a form of modular programming. In this paradigm, an overall computation is factored into subcomputations that may be executed concurrently. Pioneers in the field of concurrent computing include Edsger Dijkstra, Per Brinch Hansen, and C. A. R. Hoare.[2]
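The two ideas above, factoring an overall computation into subcomputations and letting each advance without waiting for the others, can be illustrated with a minimal sketch (here in Python's standard `threading` module; the article itself does not prescribe any language, and the function and variable names are purely illustrative):

```python
import threading

results = {}

def subcomputation(name, n):
    # Each thread of control advances independently of the others.
    total = 0
    for i in range(n):
        total += i
    results[name] = total

# Factor the overall computation into two concurrent subcomputations,
# each with its own thread of control.
t1 = threading.Thread(target=subcomputation, args=("a", 1000))
t2 = threading.Thread(target=subcomputation, args=("b", 2000))
t1.start()
t2.start()

# Neither subcomputation waits for the other to complete; the main
# thread joins both only to combine their results at the end.
t1.join()
t2.join()
overall = results["a"] + results["b"]
```

The interleaving of the two threads is not fixed: either may run first, or their steps may overlap, yet the combined result is the same, which is exactly the sense in which a concurrent system need not impose a single sequential order on its subcomputations.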