Discuss the concepts of task decomposition and data decomposition within the context of parallel programming.
- Parallel programming, or parallel computing, is the simultaneous use of multiple compute resources to solve a given problem. A parallel program always consists of simultaneously executing processes, and problem decomposition refers to the way in which these processes are formulated. This classification can also be referred to as algorithmic skeletons or parallel programming paradigms. The two basic categorizations of parallel programming paradigms are task decomposition and data decomposition.
Task decomposition refers to the whole-part composition structure of a task model. At every level of the decomposition, individual tasks are heuristically seen as representing the subtasks, that is, the steps that describe the procedural content of the whole task entity. The association relationship encapsulates the portion of meaning that goes across the limits of the individual entities. A task-parallel model focuses on processes, or threads of execution. These processes are often behaviourally dissimilar, which stresses the need for communication between them. Task parallelism provides a natural way to express message-passing communication. It is usually classified as MIMD/MPMD or MISD.
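As a minimal sketch of task decomposition, the following Python snippet runs two behaviourally dissimilar subtasks concurrently and collects their results. The function names (compute_checksum, count_words) and the choice of ThreadPoolExecutor are illustrative assumptions, not tied to any particular framework.

```python
# Task decomposition sketch: two different subtasks of one problem
# execute as separate workers and report their results back.
from concurrent.futures import ThreadPoolExecutor


def compute_checksum(data: bytes) -> int:
    """One task: sum the byte values (a stand-in for a hashing step)."""
    return sum(data) % 65536


def count_words(text: str) -> int:
    """A different task: count whitespace-separated words."""
    return len(text.split())


if __name__ == "__main__":
    payload = b"the quick brown fox jumps over the lazy dog"
    with ThreadPoolExecutor(max_workers=2) as pool:
        # Each future represents a distinct subtask of the overall problem.
        checksum_future = pool.submit(compute_checksum, payload)
        words_future = pool.submit(count_words, payload.decode())
        print("checksum:", checksum_future.result())
        print("words:", words_future.result())
```

In a message-passing setting the two tasks would exchange intermediate results explicitly; here the futures play that communication role.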
Data parallelism emphasizes redistributing the whole data set across the different parallel computing nodes. This is achieved when each processor performs the same task on its individual piece of the distributed data. In certain situations, a single execution thread controls the operations on all pieces of data; in others, individual threads control the operation, but overall they run the same code. Data parallelism focuses on the distributed (parallelized) nature of the data, as opposed to the processing.
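A minimal sketch of data decomposition, assuming a simple sum-of-squares workload and a four-way split of the data (both chosen only for illustration): every worker runs the same function on its own chunk, and the partial results are combined at the end.

```python
# Data decomposition sketch: the same operation is applied to
# independent chunks of one data set by a pool of worker processes.
from concurrent.futures import ProcessPoolExecutor


def sum_of_squares(chunk: list[int]) -> int:
    """Identical work performed by every worker on its piece of the data."""
    return sum(x * x for x in chunk)


if __name__ == "__main__":
    data = list(range(1_000))
    # Split the data into four roughly equal chunks, one per worker.
    chunks = [data[i::4] for i in range(4)]
    with ProcessPoolExecutor(max_workers=4) as pool:
        partial_sums = list(pool.map(sum_of_squares, chunks))
    print("total:", sum(partial_sums))
```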
Most real programs fall somewhere on a continuum between task parallelism and data parallelism.
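As a hedged illustration of that continuum (the structure below is an assumption for demonstration, not a prescribed pattern): two subproblems are handled as separate concurrent tasks, and each task is itself data parallel over chunks of its own input.

```python
# Continuum sketch: task parallelism on the outside, data parallelism inside.
from concurrent.futures import ProcessPoolExecutor, ThreadPoolExecutor


def total(chunk: list[int]) -> int:
    """Identical work on one chunk (the data-parallel kernel)."""
    return sum(chunk)


def parallel_sum(data: list[int]) -> int:
    """Each task splits its own data and sums the chunks in parallel."""
    chunks = [data[i::4] for i in range(4)]
    with ProcessPoolExecutor(max_workers=4) as pool:
        return sum(pool.map(total, chunks))


if __name__ == "__main__":
    evens, odds = list(range(0, 100, 2)), list(range(1, 100, 2))
    with ThreadPoolExecutor(max_workers=2) as outer:
        # Two distinct subproblems run as separate tasks (task parallelism),
        # each applying the same kernel over its chunks (data parallelism).
        even_future = outer.submit(parallel_sum, evens)
        odd_future = outer.submit(parallel_sum, odds)
        print("even sum:", even_future.result())
        print("odd sum:", odd_future.result())
```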