
Chapter 13. Parallel Your Python


Motivation

Let’s say you want to empty a tank that is full of water. If I give you one water pipe, you can connect it to the bottom of the tank to let the water out. If I give you a second pipe, using both pipes will empty the tank in half the time. The more pipes you connect to the bottom of the tank, the faster you can empty it. Something similar exists in modern computers. Most computers these days have multi-core processors, which means we can process tasks in parallel, just like using multiple pipes to drain the tank.

In this chapter, we will introduce how to run your Python programs in parallel so that you can reduce the time it takes to complete a task. For small projects, the benefit may not be obvious, but for complicated projects the gain can be significant. For example, suppose I give you a homework assignment where you normally need to wait one hour before you can tell whether your code is correct. If you find something wrong, you change your code accordingly and then wait another hour to find out whether the modification works. If it takes 20 changes to get it right, you will spend more than a day waiting. But if you can run your algorithm in parallel, the running time may shrink to 5 minutes, and those 20 rounds of changes will cost you less than 2 hours in total. Now you see why it is worth learning how to speed up our code using parallelization.
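As a quick preview of the idea (the mechanics are covered in the following sections), here is a minimal sketch that runs a slow function on several inputs, once serially and once in parallel using Python's built-in multiprocessing module. The function slow_task and the specific timings are hypothetical illustrations, not part of the book's text.

```python
import time
from multiprocessing import Pool

def slow_task(x):
    """A stand-in for an expensive computation (hypothetical example)."""
    time.sleep(1)          # pretend this takes one second of real work
    return x * x

if __name__ == '__main__':
    inputs = list(range(8))

    # Serial version: the tasks run one after another
    start = time.time()
    serial_results = [slow_task(x) for x in inputs]
    print(f'Serial:   {time.time() - start:.1f} s')

    # Parallel version: the tasks are spread over 4 worker processes,
    # like draining the tank through several pipes at once
    start = time.time()
    with Pool(processes=4) as pool:
        parallel_results = pool.map(slow_task, inputs)
    print(f'Parallel: {time.time() - start:.1f} s')
```

With 8 one-second tasks and 4 worker processes, the parallel version should finish in roughly a quarter of the serial time, which is exactly the kind of speedup this chapter is about.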