Parallel Programming

CS433A

 

3-0-0-9

 

Courses with significant overlap with this course:

Semester of last offering:

Date of approval: dd-mmm-yyyy

Prerequisites:  

Course Contents

  1. Introduction: Why parallel computing; Ubiquity of parallel hardware and multicore processors; Processes and threads; Programming models: shared memory and message passing; Speedup and efficiency; Amdahl’s Law (a brief worked example follows the course contents).

  2. Introduction to parallel hardware: Multicores and multiprocessors; Shared memory and message passing architectures; Cache hierarchy and coherence; Sequential consistency.

  3. Introduction to parallel software: Steps involved in developing a parallel program; Dependence analysis; Domain decomposition; Task assignment: static and dynamic; Performance issues: 4C cache misses, inherent and artifactual communication, false sharing (an illustrative sketch follows the course contents), computation-to-communication ratio as a guiding metric for decomposition, hot spots and staggered communication.

  4. Shared memory parallel programming: Synchronization: Locks and barriers; Hardware primitives for efficient lock implementation; Lock algorithms; Relaxed consistency models; High-level language memory models (such as Java and/or C++); Memory fences. Developing parallel programs with the UNIX fork model: IPC with shared memory and message passing; UNIX semaphores and their all-or-none semantics. Example case studies (see note below for some details). Developing parallel programs with the POSIX thread library: Thread creation; Thread join; Mutex; Condition variables. Example case studies (see note below for some details; an illustrative Pthreads sketch follows the course contents). Developing parallel programs with OpenMP directives: Parallel for; Parallel sections; Static, dynamic, guided, and runtime scheduling; Critical sections and atomic operations; Barriers; Reduction. Example case studies (see note below for some details).

  5. Message passing programming: Distributed memory model; Introduction to the Message Passing Interface (MPI); Synchronization as a Send/Recv pair; Synchronous and asynchronous Send/Recv; Collective communication: Reduce, Broadcast, Data distribution, Scatter, Gather; MPI derived data types. Example case studies (see note below for some details; an illustrative MPI sketch follows the course contents).
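A brief worked example of Amdahl’s Law from topic 1 (the fraction p used below is an assumed figure, chosen only for illustration): if a fraction p of a program’s work can be parallelized across N processors while the remaining 1 - p stays serial, the achievable speedup is

    S(N) = 1 / ((1 - p) + p/N),   which approaches 1 / (1 - p) as N grows.

For example, with p = 0.9 and N = 16, S = 1 / (0.1 + 0.9/16) = 6.4, for an efficiency of S/N = 0.4.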

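A minimal C sketch of the false sharing issue listed in topic 3. The 64-byte cache-line size, the counter layout, and the iteration count are assumptions made only for illustration: two threads updating adjacent elements of shared_counters contend for one cache line, while the padded variant places each counter on its own line.

    /* False sharing sketch: two threads bump adjacent counters (likely the same
       cache line) versus padded counters (one line each). A 64-byte line is assumed.
       Build: gcc -O2 -pthread false_sharing.c */
    #include <pthread.h>
    #include <stdio.h>

    #define ITERS 100000000L
    #define LINE 64

    struct padded { long value; char pad[LINE - sizeof(long)]; };

    static long shared_counters[2];            /* adjacent: likely one cache line */
    static struct padded padded_counters[2];   /* padded: one cache line each     */

    static void *bump_shared(void *arg) {
        long idx = (long)arg;
        for (long i = 0; i < ITERS; i++) shared_counters[idx]++;
        return NULL;
    }

    static void *bump_padded(void *arg) {
        long idx = (long)arg;
        for (long i = 0; i < ITERS; i++) padded_counters[idx].value++;
        return NULL;
    }

    int main(void) {
        pthread_t t[2];
        /* Timing each phase (e.g. with clock_gettime) typically shows the padded
           version running noticeably faster than the falsely shared one. */
        for (int v = 0; v < 2; v++) {
            void *(*fn)(void *) = v ? bump_padded : bump_shared;
            for (long i = 0; i < 2; i++) pthread_create(&t[i], NULL, fn, (void *)i);
            for (long i = 0; i < 2; i++) pthread_join(t[i], NULL);
        }
        printf("shared: %ld %ld  padded: %ld %ld\n", shared_counters[0],
               shared_counters[1], padded_counters[0].value, padded_counters[1].value);
        return 0;
    }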
 
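A minimal Pthreads sketch for topic 4 (thread creation, join, and a mutex). The array-sum computation, the thread count, and the identifiers are hypothetical and only illustrate the API; they are not one of the course case studies.

    /* Pthreads sketch: static decomposition of an array sum across NTHREADS threads;
       each thread accumulates locally and takes a mutex only for the final update.
       Build: gcc -O2 -pthread sum_pthreads.c */
    #include <pthread.h>
    #include <stdio.h>

    #define NTHREADS 4
    #define N 1000000

    static double a[N];
    static double global_sum = 0.0;
    static pthread_mutex_t sum_lock = PTHREAD_MUTEX_INITIALIZER;

    static void *partial_sum(void *arg) {
        long tid = (long)arg;
        long chunk = N / NTHREADS;                    /* static task assignment */
        long lo = tid * chunk;
        long hi = (tid == NTHREADS - 1) ? N : lo + chunk;
        double local = 0.0;
        for (long i = lo; i < hi; i++) local += a[i];
        pthread_mutex_lock(&sum_lock);                /* serialize only the update */
        global_sum += local;
        pthread_mutex_unlock(&sum_lock);
        return NULL;
    }

    int main(void) {
        pthread_t threads[NTHREADS];
        for (long i = 0; i < N; i++) a[i] = 1.0;
        for (long t = 0; t < NTHREADS; t++)
            pthread_create(&threads[t], NULL, partial_sum, (void *)t);
        for (long t = 0; t < NTHREADS; t++)
            pthread_join(threads[t], NULL);
        printf("sum = %.1f (expected %d)\n", global_sum, N);
        return 0;
    }

The same loop maps onto the OpenMP directives listed in the topic as a parallel for with a reduction clause, roughly: #pragma omp parallel for reduction(+:global_sum).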

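A minimal MPI sketch for topic 5 (initialization, rank/size queries, and a Reduce collective). The pi-integration kernel and the interval count are hypothetical choices used only to exercise the API.

    /* MPI sketch: each process integrates its share of 4/(1+x^2) over [0,1],
       then MPI_Reduce sums the partial results onto rank 0.
       Build/run (assuming an MPI installation): mpicc pi_mpi.c && mpirun -np 4 ./a.out */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv) {
        int rank, size;
        const long n = 10000000;                 /* total number of intervals */
        double local = 0.0, pi = 0.0;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        /* Cyclic data distribution: rank r handles intervals r, r+size, r+2*size, ... */
        for (long i = rank; i < n; i += size) {
            double x = (i + 0.5) / n;
            local += 4.0 / (1.0 + x * x) / n;
        }

        /* Collective communication: sum all partial results onto rank 0. */
        MPI_Reduce(&local, &pi, 1, MPI_DOUBLE, MPI_SUM, 0, MPI_COMM_WORLD);

        if (rank == 0) printf("pi ~= %.10f\n", pi);
        MPI_Finalize();
        return 0;
    }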
Topics  

Instructor(s):
Number of sections:

Tutors for each section:

Schedule for Lectures:

Schedule for Tutorial:

Schedule for Labs:
