This page documents what we do in class. It contains C programs that we will discuss in class, reading assignments from our textbook, simple homework exercises that you can work on for practice and exam preparation (not for credit), and links to other sources of information.
The reading assignments and practice problems are listed by the date on which they were assigned. The sample program files are listed by the date on which they were used. You can click on a source code link to see the code. When you are viewing the source code, you can use your browser's "File -> Save As..." menu item to save a copy of the file on your computer. It is a good idea for you to "play" with these example programs: compile them, run them, make simple changes to them, and then compile and run them again.
- Thursday, December 6.
- Here is the second exam, a take-home exam, due next Thursday, December 13.
- Thursday, November 29.
- Read Section 3.5 (pages 116-119) in our textbook.
- Here are the MPI code examples from Chapter 3 of the textbook. Be sure to read the four programs mpi_trap1.c, mpi_trap2.c, mpi_trap3.c, and mpi_trap4.c. (A minimal sketch of the same trapezoidal-rule idea appears after this list.)
- Here is the MPI version of the "successive over relaxation" problem.
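- For quick reference, here is a minimal sketch of the trapezoidal-rule computation in the same spirit as mpi_trap1.c. The integrand, the interval, and the number of trapezoids are made-up assumptions, not the textbook's exact code, and n is assumed to be evenly divisible by the number of processes.

```c
/* Minimal MPI trapezoidal-rule sketch (assumptions: f(x) = x*x on [0,1], n = 1024).
 * Each process integrates its own subinterval; process 0 adds up the pieces. */
#include <stdio.h>
#include <mpi.h>

static double f(double x) { return x * x; }

/* Serial trapezoidal rule on [left, right] with count trapezoids of width h. */
static double trap(double left, double right, int count, double h) {
    double sum = (f(left) + f(right)) / 2.0;
    for (int i = 1; i < count; i++)
        sum += f(left + i * h);
    return sum * h;
}

int main(void) {
    int my_rank, comm_sz;
    double a = 0.0, b = 1.0;
    int n = 1024;                      /* assumes comm_sz evenly divides n */

    MPI_Init(NULL, NULL);
    MPI_Comm_rank(MPI_COMM_WORLD, &my_rank);
    MPI_Comm_size(MPI_COMM_WORLD, &comm_sz);

    double h = (b - a) / n;            /* width of one trapezoid      */
    int local_n = n / comm_sz;         /* trapezoids for this process */
    double local_a = a + my_rank * local_n * h;
    double local_b = local_a + local_n * h;
    double local_int = trap(local_a, local_b, local_n, h);

    if (my_rank != 0) {
        MPI_Send(&local_int, 1, MPI_DOUBLE, 0, 0, MPI_COMM_WORLD);
    } else {
        double total_int = local_int, piece;
        for (int source = 1; source < comm_sz; source++) {
            MPI_Recv(&piece, 1, MPI_DOUBLE, source, 0,
                     MPI_COMM_WORLD, MPI_STATUS_IGNORE);
            total_int += piece;
        }
        printf("With n = %d trapezoids, the integral from %f to %f is %.15f\n",
               n, a, b, total_int);
    }

    MPI_Finalize();
    return 0;
}
```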
- Tuesday, November 27.
- See the homework page for your second programming assignment.
- Read Sections 3.3 - 3.4 (pages 97-116) in our textbook.
- Here are some MPI communication examples.
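- As a reminder of the style of those examples, here is a minimal sketch of two collective-communication calls, MPI_Bcast and MPI_Reduce. The values being communicated are made-up assumptions.

```c
/* Minimal collective-communication sketch: process 0 broadcasts a value,
 * then every process contributes its rank to a sum collected on process 0. */
#include <stdio.h>
#include <mpi.h>

int main(void) {
    int my_rank, comm_sz;

    MPI_Init(NULL, NULL);
    MPI_Comm_rank(MPI_COMM_WORLD, &my_rank);
    MPI_Comm_size(MPI_COMM_WORLD, &comm_sz);

    int value = 0;
    if (my_rank == 0) value = 42;      /* assumption: data known only to rank 0 */

    /* After the broadcast, every process has rank 0's value. */
    MPI_Bcast(&value, 1, MPI_INT, 0, MPI_COMM_WORLD);

    /* Sum each process's rank onto process 0. */
    int my_contribution = my_rank, total = 0;
    MPI_Reduce(&my_contribution, &total, 1, MPI_INT, MPI_SUM, 0, MPI_COMM_WORLD);

    if (my_rank == 0)
        printf("broadcast value = %d, sum of ranks = %d\n", value, total);

    MPI_Finalize();
    return 0;
}
```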
- Thursday, November 22.
- Tuesday, November 20.
- Here are some introductory MPI examples.
- Here is a version of the course build environment that will let you compile and run MPI programs on a personal computer. This file needs to be unzipped to your C:\ drive in order to work correctly.
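- Here is a minimal sketch of an MPI "hello" program together with typical compile and run commands. The exact commands in the course build environment may differ slightly.

```c
/* Minimal MPI "hello" sketch. Typical build/run commands (may vary by setup):
 *   mpicc mpi_hello.c -o mpi_hello
 *   mpiexec -n 4 ./mpi_hello
 */
#include <stdio.h>
#include <mpi.h>

int main(void) {
    int my_rank, comm_sz;

    MPI_Init(NULL, NULL);
    MPI_Comm_rank(MPI_COMM_WORLD, &my_rank);
    MPI_Comm_size(MPI_COMM_WORLD, &comm_sz);

    printf("Hello from process %d of %d\n", my_rank, comm_sz);

    MPI_Finalize();
    return 0;
}
```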
- Thursday, November 15.
- Read Sections 3.1 - 3.2 (pages 83-97) on MPI in our textbook.
- Here are two good MPI tutorials.
- Tuesday, November 13.
- Here are solutions to the midterm exam.
- Try installing the following version of OpenCL for Intel CPUs. If you don't have an OpenCL-compatible GPU, you can use this version of OpenCL to run OpenCL programs. And if you do have a GPU, this gives you another OpenCL platform to experiment with.
- Thursday, November 8.
- We will go over this code today.
- Tuesday, November 6.
- We will go over this code today. The folder contains all the headers and libraries needed to build the examples (see the make files for the compiler commands). But you need to install an OpenCL runtime (OpenCL.dll, etc.) appropriate for your GPU.
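- As a quick sanity check that your OpenCL headers, libraries, and runtime are set up, here is a minimal sketch that only lists the available platforms and devices. It is an illustration, not one of the course examples.

```c
/* Minimal OpenCL sketch: list the available platforms and devices.
 * Link against the OpenCL library (e.g. gcc cl_query.c -lOpenCL). */
#include <stdio.h>
#include <CL/cl.h>

int main(void) {
    cl_platform_id platforms[8];
    cl_uint num_platforms = 0;

    if (clGetPlatformIDs(8, platforms, &num_platforms) != CL_SUCCESS
        || num_platforms == 0) {
        printf("No OpenCL platforms found.\n");
        return 1;
    }

    for (cl_uint p = 0; p < num_platforms; p++) {
        char name[256];
        clGetPlatformInfo(platforms[p], CL_PLATFORM_NAME, sizeof(name), name, NULL);
        printf("Platform %u: %s\n", p, name);

        cl_device_id devices[8];
        cl_uint num_devices = 0;
        clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_ALL, 8, devices, &num_devices);
        for (cl_uint d = 0; d < num_devices; d++) {
            clGetDeviceInfo(devices[d], CL_DEVICE_NAME, sizeof(name), name, NULL);
            printf("  Device %u: %s\n", d, name);
        }
    }
    return 0;
}
```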
- Thursday, November 1.
- Here are some introductions to OpenCL.
- Here is a good explanation of GPU architecture.
- Tuesday, October 30.
- Here is an explanation of two-dimensional arrays in C; a short sketch of the main idea appears after this list.
- Use either the program GPU Caps Viewer or the program GPU-Z to determine if you can run OpenCL programs on your computer.
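- Here is a minimal sketch of the main point about two-dimensional arrays in C: a 2D array is one contiguous, row-major block of memory. The array dimensions here are arbitrary assumptions.

```c
/* A 2D array in C is one contiguous block stored in row-major order,
 * so a[i][j] lives at offset i*COLS + j from the start of the block. */
#include <stdio.h>
#include <stdlib.h>

#define ROWS 3
#define COLS 4

int main(void) {
    int a[ROWS][COLS];                 /* contiguous ROWS*COLS ints   */
    int *flat = &a[0][0];              /* view the same memory as 1D  */

    for (int i = 0; i < ROWS; i++)
        for (int j = 0; j < COLS; j++)
            a[i][j] = i * COLS + j;

    /* Same element, two ways of indexing it. */
    printf("a[2][3] = %d, flat[2*COLS + 3] = %d\n", a[2][3], flat[2 * COLS + 3]);

    /* A dynamically allocated "2D" array is often just a 1D block
     * indexed with the same arithmetic. */
    double *b = malloc(ROWS * COLS * sizeof(double));
    b[1 * COLS + 2] = 3.14;            /* conceptually b[1][2] */
    printf("b[1][2] = %f\n", b[1 * COLS + 2]);
    free(b);

    return 0;
}
```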
- Thursday, October 25.
- We will go over this code today.
- Tuesday, October 23.
- Read Chapter 5, Sections 5.8-5.11 (pages 241-262) from the textbook.
- Once again, here are the OpenMP code examples from Chapter 5 of the textbook.
- Thursday, October 18.
- According to the syllabus, today is midterm exam day, so today we'll have an exam. But don't panic: it's a take-home exam, due next Thursday, October 25.
- Tuesday, October 16.
- No class because of Fall Break.
- Thursday, October 4.
- Here is a brief summary of the parts of OpenMP we will use, along with a brief explanation of their semantics. A small example using several of these constructs appears after this list.
- We will look at these examples today.
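- Here is a small sketch that exercises a few of the OpenMP constructs we rely on (parallel regions, work-sharing loops, reduction, and critical sections). It is an illustration, not one of today's examples, and the loop bound is an arbitrary assumption.

```c
/* Small OpenMP sketch: parallel for with a reduction, plus a critical section.
 * Compile with: gcc -fopenmp openmp_sketch.c -o openmp_sketch */
#include <stdio.h>
#include <omp.h>

#define N 1000000

int main(void) {
    double sum = 0.0;

    /* The reduction clause gives each thread a private copy of sum
     * and combines the copies when the loop finishes. */
    #pragma omp parallel for reduction(+:sum)
    for (int i = 0; i < N; i++)
        sum += 1.0 / (i + 1.0);

    /* A critical section serializes the threads' access to shared output. */
    #pragma omp parallel
    {
        #pragma omp critical
        printf("thread %d of %d sees sum = %f\n",
               omp_get_thread_num(), omp_get_num_threads(), sum);
    }

    return 0;
}
```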
- Tuesday, October 2.
- We will look at these examples today.
- Thursday, September 27.
- Read Chapter 5, Sections 5.5-5.7 (pages 224-241) from the textbook.
- Tuesday, September 25.
- Here is a nice introduction to the main ideas behind OpenMP. It is the first chapter from a very good book on OpenMP. (But the example OpenMP program given on page 20 is wrong. Can you find the error?)
- Here is some online documentation about OpenMP.
- Thursday, September 20.
- Read Chapter 5, Sections 5.1-5.4 (pages 209-224) from the textbook.
- Here are some introductory OpenMP code examples. To compile OpenMP programs, you need to give the GCC compiler the -fopenmp option; a minimal example with the compile command is sketched after this list.
- Here are the OpenMP code examples from Chapter 5 of the textbook.
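- For reference, here is a minimal OpenMP sketch together with the -fopenmp compile command. It is an illustration, not one of the course examples.

```c
/* Minimal OpenMP sketch. Compile with the -fopenmp option:
 *   gcc -fopenmp omp_hello.c -o omp_hello */
#include <stdio.h>
#include <omp.h>

int main(void) {
    /* Each thread in the team executes this block once. */
    #pragma omp parallel
    {
        int id = omp_get_thread_num();
        int count = omp_get_num_threads();
        printf("Hello from thread %d of %d\n", id, count);
    }
    return 0;
}
```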
- Tuesday, September 18.
- Here is a code example that demonstrates "false sharing".
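- Here is a minimal sketch of the false-sharing idea; the counter layout and iteration count are made-up assumptions, and the effect shows up as a slowdown rather than a wrong answer.

```c
/* Sketch of "false sharing": two threads update *different* counters that
 * sit next to each other in memory and therefore almost certainly share a
 * cache line, so the line ping-pongs between the threads' caches. Timing
 * this against a version where each counter is padded out to its own cache
 * line (typically 64 bytes) shows the cost. Compile with: gcc -pthread */
#include <stdio.h>
#include <pthread.h>

#define ITERATIONS 100000000L

volatile long counters[2];             /* adjacent: likely one cache line */

void* work(void* arg) {
    int index = *(int*)arg;
    for (long i = 0; i < ITERATIONS; i++)
        counters[index]++;             /* each thread touches its own counter */
    return NULL;
}

int main(void) {
    pthread_t t0, t1;
    int i0 = 0, i1 = 1;

    pthread_create(&t0, NULL, work, &i0);
    pthread_create(&t1, NULL, work, &i1);
    pthread_join(t0, NULL);
    pthread_join(t1, NULL);

    printf("counters: %ld %ld\n", counters[0], counters[1]);
    return 0;
}
```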
- Tuesday, September 11.
- Read Chapter 4, Sections 4.10-4.12 (pages 190-200) from the textbook.
- Also read Sections 2.2.1-2.2.3 (pages 19-23) from the textbook.
- Also read Section 2.3.4 (pages 43-46) from the textbook.
- Here are code examples using "condition variables".
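- Here is a minimal sketch of the basic condition-variable pattern (lock, re-check the condition in a loop, wait, signal). It is an illustration, not one of the linked examples.

```c
/* Minimal pthread condition-variable sketch: one thread waits until another
 * thread sets a flag and signals it. Compile with: gcc -pthread */
#include <stdio.h>
#include <pthread.h>

pthread_mutex_t lock  = PTHREAD_MUTEX_INITIALIZER;
pthread_cond_t  ready = PTHREAD_COND_INITIALIZER;
int data_ready = 0;

void* producer(void* arg) {
    pthread_mutex_lock(&lock);
    data_ready = 1;                     /* make the condition true ... */
    pthread_cond_signal(&ready);        /* ... then wake the waiter    */
    pthread_mutex_unlock(&lock);
    return NULL;
}

void* consumer(void* arg) {
    pthread_mutex_lock(&lock);
    while (!data_ready)                 /* always re-check in a loop   */
        pthread_cond_wait(&ready, &lock);
    printf("consumer saw data_ready\n");
    pthread_mutex_unlock(&lock);
    return NULL;
}

int main(void) {
    pthread_t p, c;
    pthread_create(&c, NULL, consumer, NULL);
    pthread_create(&p, NULL, producer, NULL);
    pthread_join(p, NULL);
    pthread_join(c, NULL);
    return 0;
}
```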
- Thursday, September 6.
- Here is a brief model for how we can think of the implementation of a semaphore.
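- Here is one way to sketch such a model in code, using a mutex and a condition variable. It is an illustration of the idea, not necessarily the model presented in class.

```c
/* One way to model a counting semaphore with a mutex and a condition variable. */
#include <pthread.h>

typedef struct {
    pthread_mutex_t lock;
    pthread_cond_t  nonzero;
    int             value;
} semaphore;

void semaphore_init(semaphore* s, int initial) {
    pthread_mutex_init(&s->lock, NULL);
    pthread_cond_init(&s->nonzero, NULL);
    s->value = initial;
}

/* P / wait: block until the count is positive, then decrement it. */
void semaphore_wait(semaphore* s) {
    pthread_mutex_lock(&s->lock);
    while (s->value == 0)
        pthread_cond_wait(&s->nonzero, &s->lock);
    s->value--;
    pthread_mutex_unlock(&s->lock);
}

/* V / signal: increment the count and wake one waiting thread. */
void semaphore_signal(semaphore* s) {
    pthread_mutex_lock(&s->lock);
    s->value++;
    pthread_cond_signal(&s->nonzero);
    pthread_mutex_unlock(&s->lock);
}

int main(void) {
    semaphore s;
    semaphore_init(&s, 1);
    semaphore_wait(&s);     /* acquire */
    semaphore_signal(&s);   /* release */
    return 0;
}
```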
- Thursday, August 30.
- Read Chapter 4, Section 4.8 (pages 176-181) from the textbook.
- We are going to start studying thread synchronization. Read Chapter 1, Chapter 2, and Sections 3.1-3.3 (pages 1-19) from the online textbook, "The Little Book of Semaphores".
- Here are code examples for the "producer/consumer" synchronization problem.
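- Here is a minimal sketch of the classic bounded-buffer solution using POSIX semaphores; the buffer size and item count are made-up assumptions, and the linked examples may structure things differently.

```c
/* Producer/consumer on a bounded buffer, coordinated with POSIX semaphores.
 * One producer and one consumer, so no mutex is needed on the indices.
 * Compile on Linux with: gcc -pthread prod_cons.c */
#include <stdio.h>
#include <pthread.h>
#include <semaphore.h>

#define BUFFER_SIZE 4
#define ITEMS       10

int buffer[BUFFER_SIZE];
int in = 0, out = 0;          /* next slot to fill / to empty         */
sem_t empty_slots;            /* counts free slots in the buffer      */
sem_t full_slots;             /* counts items waiting to be consumed  */

void* producer(void* arg) {
    for (int i = 0; i < ITEMS; i++) {
        sem_wait(&empty_slots);          /* wait for a free slot      */
        buffer[in] = i;
        in = (in + 1) % BUFFER_SIZE;
        sem_post(&full_slots);           /* announce the new item     */
    }
    return NULL;
}

void* consumer(void* arg) {
    for (int i = 0; i < ITEMS; i++) {
        sem_wait(&full_slots);           /* wait for an item          */
        int item = buffer[out];
        out = (out + 1) % BUFFER_SIZE;
        sem_post(&empty_slots);          /* free the slot             */
        printf("consumed %d\n", item);
    }
    return NULL;
}

int main(void) {
    pthread_t p, c;
    sem_init(&empty_slots, 0, BUFFER_SIZE);
    sem_init(&full_slots, 0, 0);
    pthread_create(&p, NULL, producer, NULL);
    pthread_create(&c, NULL, consumer, NULL);
    pthread_join(p, NULL);
    pthread_join(c, NULL);
    return 0;
}
```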
- Tuesday, August 28.
- See the homework page for your first programming assignment.
- Read Chapter 4, Sections 4.4 - 4.7 (pages 162-175) from the textbook.
- Here are code examples for the "race condition" synchronization problem.
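- Here is a minimal sketch of the race: two threads increment a shared counter with no synchronization, so the final count usually comes out short. The iteration count is an arbitrary assumption.

```c
/* Sketch of a race condition: two threads do unsynchronized read-modify-write
 * updates of a shared counter. Compile with: gcc -pthread race.c */
#include <stdio.h>
#include <pthread.h>

#define INCREMENTS 1000000L

volatile long counter = 0;    /* shared and unprotected */

void* work(void* arg) {
    for (long i = 0; i < INCREMENTS; i++)
        counter++;            /* load, increment, store: not atomic */
    return NULL;
}

int main(void) {
    pthread_t t1, t2;
    pthread_create(&t1, NULL, work, NULL);
    pthread_create(&t2, NULL, work, NULL);
    pthread_join(t1, NULL);
    pthread_join(t2, NULL);

    printf("expected %ld, got %ld\n", 2 * INCREMENTS, counter);
    return 0;
}
```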
- Thursday, August 23.
- Read Chapter 4, Sections 4.1 - 4.3 (pages 151-162) from the textbook.
- Here is a good reference for threads and how they fit into the theory of operating systems. Look at the "middle piece", Concurrency, in blue.
- Here are a bunch of example thread programs.
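- Here is a minimal Pthreads sketch (create a few threads, pass each one an id, and join them). It is an illustration of the basic API, not one of the linked programs.

```c
/* Minimal Pthreads sketch. Compile with: gcc -pthread hello_threads.c */
#include <stdio.h>
#include <pthread.h>

#define NUM_THREADS 4

void* hello(void* arg) {
    long id = (long) arg;               /* thread id passed by value */
    printf("Hello from thread %ld\n", id);
    return NULL;
}

int main(void) {
    pthread_t threads[NUM_THREADS];

    for (long t = 0; t < NUM_THREADS; t++)
        pthread_create(&threads[t], NULL, hello, (void*) t);

    for (int t = 0; t < NUM_THREADS; t++)
        pthread_join(threads[t], NULL);

    return 0;
}
```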
- Tuesday, August 21.
- For today and Thursday, read Chapter 1 from our textbook.
- Read the following introductory chapter (from a different parallel programming textbook).
- Here is the "count 3s" code described in the previous link; a serial sketch of the computation appears after this list.
- Here is a third introductory chapter from a parallel programming textbook.
- The following links are to two fairly famous articles that spell out the reasons why parallel programming has become such an essential (yet difficult) tool for software developers. They were written by Herb Sutter.
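- Following up on the "count 3s" item above, here is a serial sketch of the computation; the chapter develops parallel versions from this starting point, and the array contents here are made-up assumptions.

```c
/* Serial sketch of the "count 3s" problem: count how many array elements
 * equal 3. Parallel versions split the array among threads and combine
 * the per-thread counts. */
#include <stdio.h>
#include <stdlib.h>

#define LENGTH 1000000

int main(void) {
    int *array = malloc(LENGTH * sizeof(int));
    for (int i = 0; i < LENGTH; i++)
        array[i] = rand() % 10;         /* assumption: random digits 0-9 */

    int count = 0;
    for (int i = 0; i < LENGTH; i++)
        if (array[i] == 3)
            count++;

    printf("found %d threes\n", count);
    free(array);
    return 0;
}
```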