diff --git a/python-on-hpc/introduction.md b/python-on-hpc/introduction.md
deleted file mode 100644
index 86fa112..0000000
--- a/python-on-hpc/introduction.md
+++ /dev/null
@@ -1,7 +0,0 @@
-# Introduction
-
-Parallel computing works by launching multiple processes on a computer that run in parallel. You can run parallel programs in Python using a few different methods, but we'll go over two.
-
-[The first method](multiprocessing.md) is the simplest and uses a Python package called _multiprocessing._ This method can be used on any computer that has more than one processor core (yes, this includes your personal laptop). This package allows you to do things like parallelize a for loop and can be super helpful in everyday programming.
-
-We enter the world of MPI (Message Passing Interface) in [the second method](mpi-and-python.md). This is a more involved method than the first and requires setting up a 'world' and assigning ranks to each of the processes that will be involved in our program.
diff --git a/python-on-hpc/mpi-and-python.md b/python-on-hpc/mpi-and-python.md
deleted file mode 100644
index e598cef..0000000
--- a/python-on-hpc/mpi-and-python.md
+++ /dev/null
@@ -1,51 +0,0 @@
-# MPI & Python
-
-### What is MPI?
-
-MPI stands for Message Passing Interface and is a way of passing messages between multiple computers running a parallel program.
-
-> **Rank:** Each process in a parallel program has a unique rank, i.e. an integer identifier. The rank value is between 0 and the number of processes - 1.
-
-The total number of these ranks is called the _world size_.
-
-> **World Size:** Total number of ranks in a parallel program.
-
-### Types of Communication
-
-> **Blocking Communication:** Blocking communication halts all activity within a process until a message has been sent or received.
-
-> **Non-Blocking Communication:** Non-blocking communication allows activity within a process to continue, even before a message has been fully sent or received.
-
-#### Point to Point (P2P)
-
-> **Point-to-Point (P2P) communication:** When only two processes within a parallel program communicate with each other.
-
-For example, suppose a parallel program has 10 processes (i.e. the world size is 10). Process 1 sends a message to Process 2 letting it know that it has completed a task, but does not communicate with any other process. This is an example of P2P communication.
-
-_Blocking P2P Methods_
-
-send...
-
-recv...
-
-_Non-Blocking P2P Methods_
-
-isend...
-
-irecv...
-
-#### Collective Communication (CC)
-
-> **Collective communication (CC)**: A type of communication that involves all or some of the processes in a parallel program.
-
-For example, take the same parallel program with 10 processes. The objective of process 1 is to average a value that is stored on each of the other processes. In this case, process 1 would issue a CC call to the other processes and _collect_ their values in order to average them. Because process 1 communicated with multiple processes instead of just one, this is an example of collective communication.
-
-_CC Methods_
-
-bcast
-
-scatter
-
-gather
-
-#### References & Helpful Sites
diff --git a/python-on-hpc/multiprocessing.md b/python-on-hpc/multiprocessing.md
deleted file mode 100644
index a1ee037..0000000
--- a/python-on-hpc/multiprocessing.md
+++ /dev/null
@@ -1,4 +0,0 @@
-# Multiprocessing
-
-More to come here soon - if you really need single-node multiprocessing right now, you can check out:
-https://docs.python.org/3/library/multiprocessing.html