A form of parallelization of computer code. From Wikipedia, the free encyclopedia.
Task parallelism (also known as thread-level parallelism, function parallelism, or control parallelism) is a form of parallel computing for multiple processors in which the execution of processes and threads is distributed across different parallel processor nodes. It contrasts with data parallelism, another form of parallelism.
In a multiprocessor system, task parallelism is achieved when each processor executes a different thread (or process) on the same or different data. The threads may execute the same or different code. The execution threads usually communicate with one another as they work, typically to pass data.
As a simple example, if we are running code on a two-processor system (CPUs "a" and "b") in a parallel computing environment and we want to perform tasks "A" and "B", we can tell CPU "a" to do task "A" and CPU "b" to do task "B" simultaneously, reducing the total runtime of the execution.
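The two-task example above can be sketched in Python using the standard threading module. The task bodies here (summing a range, joining a string) are hypothetical stand-ins for tasks "A" and "B"; note that on CPython, true simultaneous execution of CPU-bound threads is limited by the global interpreter lock, so a real program might use processes instead.

```python
import threading

results = {}

def task_a():
    # Task "A": a CPU-bound computation (hypothetical workload)
    results["A"] = sum(range(1_000_000))

def task_b():
    # Task "B": a different, unrelated computation (hypothetical workload)
    results["B"] = "-".join(str(i) for i in range(5))

# Each thread runs different code -- this is task parallelism,
# as opposed to data parallelism (same code, different data).
ta = threading.Thread(target=task_a)
tb = threading.Thread(target=task_b)
ta.start()
tb.start()
ta.join()
tb.join()

print(results)
```

The key point is that the two threads execute different functions, not the same function over partitioned data.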
Task parallelism is used by multi-user and multitasking operating systems, and by applications that depend on processes and threads, unlike data-processing applications (see data parallelism). Most real programs use a combination of task parallelism and data parallelism.
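A minimal sketch of combining the two forms, using Python's concurrent.futures: the map call is data parallelism (the same function applied across a data set), while the two submit calls are task parallelism (different functions run concurrently). The functions and data here are illustrative assumptions, not from the article.

```python
from concurrent.futures import ThreadPoolExecutor

def double(x):
    return 2 * x

with ThreadPoolExecutor() as pool:
    # Data parallelism: same function, different elements of the data set.
    doubled = list(pool.map(double, [1, 2, 3, 4]))
    # Task parallelism: two different operations submitted concurrently.
    f_sum = pool.submit(sum, doubled)
    f_max = pool.submit(max, doubled)
    total, largest = f_sum.result(), f_max.result()

print(doubled, total, largest)
```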