One of the critical factors in HPC is parallelism. To solve a problem faster, or to handle larger data sets, a task can be divided into several sub-tasks that are executed in parallel. The key to supercomputers is not ultra-powerful microprocessors or expensive, exotic components different from those in our home computers, but parallelism.
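As a rough illustration of this idea, the sketch below splits one large summation into sub-tasks and runs them in parallel. It uses Python's standard `multiprocessing` module rather than MPI, and the function names (`partial_sum`, `parallel_sum`) are illustrative, not part of any course material:

```python
# A minimal sketch of dividing one task into parallel sub-tasks.
# This uses Python's multiprocessing module, NOT MPI; it only
# illustrates the general idea of splitting work across workers.
from multiprocessing import Pool

def partial_sum(bounds):
    # One sub-task: sum a slice of the overall range.
    start, end = bounds
    return sum(range(start, end))

def parallel_sum(n, workers=4):
    # Divide the range [0, n) into roughly equal chunks, one per worker.
    chunk = n // workers
    bounds = [(i * chunk, (i + 1) * chunk if i < workers - 1 else n)
              for i in range(workers)]
    # Each worker computes its chunk; the partial results are combined.
    with Pool(workers) as pool:
        return sum(pool.map(partial_sum, bounds))

if __name__ == "__main__":
    print(parallel_sum(1_000_000))  # same result as sum(range(1_000_000))
```

MPI programs follow the same divide-compute-combine pattern, but the workers are separate processes (often on different machines) that communicate by passing messages.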
Therefore, our team has created the Introduction to MPI (Message Passing Interface) online course. The goal is to give anyone a solid foundation on which to build their knowledge of parallel programming.
What will you learn?