An Introduction to Parallel Programming, 1st Edition
Peter Pacheco
ISBN 9780123742605


Morgan Kaufmann



Dimensions: 240 × 197

The first true undergraduate text in parallel programming, covering OpenMP, MPI, and Pthreads

Print Book | In Stock | USD 79.95

Key Features

  • Takes a tutorial approach, starting with small programming examples and building progressively to more challenging examples
  • Focuses on designing and debugging parallel programs, and on evaluating the performance of distributed- and shared-memory programs
  • Explains how to develop parallel programs using MPI, Pthreads, and OpenMP programming models

    Description

    Author Peter Pacheco uses a tutorial approach to show students how to develop effective parallel programs with MPI, Pthreads, and OpenMP. The first undergraduate text to directly address compiling and running parallel programs on modern multi-core and cluster architectures, An Introduction to Parallel Programming explains how to design, debug, and evaluate the performance of distributed- and shared-memory programs. User-friendly exercises teach students how to compile, run, and modify example programs.


    Audience

    Students in undergraduate parallel programming or parallel computing courses, whether designed for computer science majors or as a service course for other departments, as well as professionals with no background in parallel computing.

    Peter Pacheco

    Peter Pacheco received a PhD in mathematics from Florida State University. After completing graduate school, he became one of the first professors in UCLA’s “Program in Computing,” which teaches basic computer science to students in UCLA’s College of Letters and Science. Since leaving UCLA, he has been on the faculty of the University of San Francisco. At USF Peter has served as chair of the computer science department and is currently chair of the mathematics department. His research is in parallel scientific computing. He has worked on the development of parallel software for circuit simulation, speech recognition, and the simulation of large networks of biologically accurate neurons. Peter has been teaching parallel computing at both the undergraduate and graduate levels for nearly twenty years. He is the author of Parallel Programming with MPI, published by Morgan Kaufmann Publishers.

    Affiliations and Expertise

    University of San Francisco, USA

    Table of Contents

    1 Why Parallel Computing

    1.1 Why We Need Ever-Increasing Performance

    1.2 Why We’re Building Parallel Systems

    1.3 Why We Need to Write Parallel Programs

    1.4 How Do We Write Parallel Programs?

    1.5 What We’ll Be Doing

    1.6 Concurrent, Parallel, Distributed

    1.7 The Rest of the Book

    1.8 A Word of Warning

    1.9 Typographical Conventions

    1.10 Summary

    1.11 Exercises

    2 Parallel Hardware and Parallel Software

    2.1 Some Background

    2.2 Modifications to the von Neumann Model

    2.3 Parallel Hardware

    2.4 Parallel Software

    2.5 Input and Output

    2.6 Performance

    2.7 Parallel Program Design

    2.8 Writing and Running Parallel Programs

    2.9 Assumptions

    2.10 Summary

    2.11 Exercises

    3 Distributed Memory Programming with MPI

    3.1 Getting Started

    3.2 The Trapezoidal Rule in MPI

    3.3 Dealing with I/O

    3.4 Collective Communication

    3.5 MPI Derived Datatypes

    3.6 Performance Evaluation of MPI Programs

    3.7 A Parallel Sorting Algorithm

    3.8 Summary

    3.9 Exercises

    3.10 Programming Assignments

    4 Shared Memory Programming with Pthreads

    4.1 Processes, Threads and Pthreads

    4.2 Hello, World

    4.3 Matrix-Vector Multiplication

    4.4 Critical Sections

    4.5 Busy-Waiting

    4.6 Mutexes

    4.7 Producer-Consumer Synchronization and Semaphores

    4.8 Barriers and Condition Variables

    4.9 Read-Write Locks

    4.10 Caches, Cache-Coherence, and False Sharing

    4.11 Thread-Safety

    4.12 Summary

    4.13 Exercises

    4.14 Programming Assignments

    5 Shared Memory Programming with OpenMP

    5.1 Getting Started

    5.2 The Trapezoidal Rule

    5.3 Scope of Variables

    5.4 The Reduction Clause

    5.5 The Parallel For Directive

    5.6 More About Loops in OpenMP: Sorting

    5.7 Scheduling Loops

    5.8 Producers and Consumers

    5.9 Caches, Cache-Coherence, and False Sharing

    5.10 Thread-Safety

    5.11 Summary

    5.12 Exercises

    5.13 Programming Assignments

    6 Parallel Program Development

    6.1 Two N-Body Solvers

    6.2 Tree Search

    6.3 A Word of Caution

    6.4 Which API?

    6.5 Summary

    6.6 Exercises

    6.7 Programming Assignments

    7 Where to Go from Here

    Quotes and Reviews

    "Pacheco succeeds in introducing the reader to the key issues and considerations in parallel programming. The simplicity of the examples allows the reader to focus on parallel programming aspects rather than application logic. Including both MPI and Pthreads/OpenMP is a good way to illustrate the differences between message passing and shared-memory programming models. The discussions about analyzing the scalability and efficiency of the resulting parallel programs present a key aspect of developing real parallel programs. Finally, working through the same examples using all three facilities helps make this even more concrete."--W. Hu, ComputingReviews.com

    "[T]his is a well-written book, appropriately targeted at junior undergraduates. Being easily digestible, it makes the difficult task of parallel programming come across a lot less daunting than I have seen in other texts. Admittedly, it is light on theory; however, the most memorable lessons in parallel programming are those learned from mistakes made. With over 100 programming exercises, learning opportunities abound."--Bernard Kuc, ACM Computing Reviews

    With the coming of multicore processors and the cloud, parallel computing is most certainly not a niche area off in a corner of the computing world. Parallelism has become central to the efficient use of resources, and this new textbook by Peter Pacheco will go a long way toward introducing students early in their academic careers to both the art and practice of parallel computing.

    Duncan Buell
    Department of Computer Science and Engineering
    University of South Carolina

    An Introduction to Parallel Programming illustrates fundamental programming principles in the increasingly important area of shared memory programming using Pthreads and OpenMP and distributed memory programming using MPI. More importantly, it emphasizes good programming practices by indicating potential performance pitfalls. These topics are presented in the context of a variety of disciplines including computer science, physics and mathematics. The chapters include numerous programming exercises that range from easy to very challenging. This is an ideal book for students or professionals looking to learn parallel programming skills or to refresh their knowledge.

    Leigh Little
    Department of Computational Science
    The College at Brockport, The State University of New York

    An Introduction to Parallel Programming is a well written, comprehensive book on the field of parallel computing. Students and practitioners alike will appreciate the relevant, up-to-date information. Peter Pacheco’s very accessible writing style combined with numerous interesting examples keeps the reader’s attention. In a field that races forward at a dizzying pace, this book hangs on for the wild ride covering the ins and outs of parallel hardware and software.

    Kathy J. Liszka
    Department of Computer Science
    University of Akron

    Parallel computing is the future and this book really helps introduce this complicated subject with practical and useful examples.

    Andrew N. Sloss FBCS
    Consultant Engineer, ARM
    Author of ARM System Developer’s Guide
