Thankfully, logging is part of Python's standard library, so we can easily import it into our Python programs like this:
import logging
Once we've done this, we configure our logging system to write more meaningful log messages to a file within our program's directory, as follows:
logging.basicConfig(filename='myapp.log', level=logging.INFO,
                    format='%(processName)-10s %(asctime)s:%(levelname)s:%(message)s')
In the preceding lines, we specify the filename that we want to log to and set the logging level to INFO. We then specify the format that every logging message follows when it's appended to our log file.
Notice that the first directive we add to our format string is %(processName)-10s. This ensures that we have traceability over every worker process within our application.
import logging
from multiprocessing import Pool

logging.basicConfig(filename='myapp.log', level=logging.INFO,
                    format='%(processName)-10s %(asctime)s:%(levelname)s:%(message)s')

def myTask(n):
    logging.info("{} being processed".format(n))
    logging.info("Final Result: {}".format(n*2))
    return n*2

def main():
    with Pool(4) as p:
        p.map(myTask, [2, 3, 4, 5, 6])

if __name__ == '__main__':
    main()
Upon execution of the preceding code, you should see something like the following written to the myapp.log file.
Every line features the name of the process that has taken on the task, a timestamp (yours will differ), the level name, and the log message being given as output:
ForkPoolWorker-1 2023-05-10 14:00:00,100:INFO:2 being processed
ForkPoolWorker-1 2023-05-10 14:00:00,101:INFO:Final Result: 4
ForkPoolWorker-2 2023-05-10 14:00:00,100:INFO:3 being processed
ForkPoolWorker-2 2023-05-10 14:00:00,102:INFO:Final Result: 6
ForkPoolWorker-3 2023-05-10 14:00:00,101:INFO:4 being processed
ForkPoolWorker-3 2023-05-10 14:00:00,103:INFO:Final Result: 8
ForkPoolWorker-4 2023-05-10 14:00:00,102:INFO:5 being processed
ForkPoolWorker-4 2023-05-10 14:00:00,104:INFO:Final Result: 10
ForkPoolWorker-1 2023-05-10 14:00:00,105:INFO:6 being processed
ForkPoolWorker-1 2023-05-10 14:00:00,106:INFO:Final Result: 12