Now, after reading the code and all of the included comments, let's review what's happening.
First, the work to be done is spelled out by creating a queue of individual tasks to be completed. Each of these tasks is actually the argument that will be passed to the work function, doWork(). Then 8 threads are spawned, which will work simultaneously to complete the tasks stored in the queue. (Note: the threads have to be defined as daemons; the reason is explained just after this review.)
To actually hook up the work to the threads, the worker/threader function worker() was created. It tells each thread that runs it to keep taking the next task from the queue, complete it, then mark it as done (remember that each "task" from the queue is the argument(s) used to call the work function doWork()). The 8 threads work continuously in parallel until all of the work in the queue is finished, and que.join() prevents the process/program from exiting before all of the work has been completed.
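The pattern described above can be sketched as follows. This is a minimal, self-contained reconstruction, not the exact code from the earlier steps: doWork() here is a hypothetical stand-in that just prints its task.

```python
import queue
import threading

def doWork(task):
    # Hypothetical stand-in work function: just announce the task
    print("%s processing task %d" % (threading.current_thread().name, task))

def worker(que):
    # Each thread loops forever: take the next task, do it, mark it done
    while True:
        task = que.get()
        doWork(task)
        que.task_done()

# Step 1: spell out the work as tasks in a queue
que = queue.Queue()
for task in range(20):
    que.put(task)

# Step 2: spawn 8 daemon threads to work through the queue in parallel
for i in range(8):
    threading.Thread(target=worker, args=(que,), daemon=True).start()

# Block until every task has been marked done, then let the program exit
que.join()
```

Note that worker() never returns; the threads only die because they are daemons and the program is allowed to exit once que.join() unblocks.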
Disclaimer: running too many threads at once can exhaust system resources and cause your machine to hang or become unresponsive. 8 is a pretty safe number here, though.
Now, why are the threads daemons? The simple reason is so that the threads are killed as soon as the process/program exits. In the Python threading module, all daemon threads are automatically killed (abruptly, not gracefully) when the program ends. If the threads weren't set as daemons, one would have to go through the trouble of keeping track of them and then manually shutting each one down.
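To see the daemon behavior in isolation, consider this standalone sketch: the background thread loops forever, yet the program still exits cleanly, because the interpreter kills daemon threads when the main thread finishes. With daemon=True removed, the same program would hang forever.

```python
import threading
import time

def background():
    # This loop never returns on its own
    while True:
        time.sleep(0.1)

# daemon must be set before start(); it cannot be changed afterwards
t = threading.Thread(target=background, daemon=True)
t.start()

print(t.daemon)      # confirms the thread is a daemon
print(t.is_alive())  # the thread is running when the main thread ends here
```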
There is one more thing worth pointing out about the program exiting. As you may have noticed in the example from the first recommended reading for threads, a Thread.join() call is used to wait for the threads to completely finish executing before letting the process exit; it "blocks" the process from exiting. Here, however, something different is done: Queue.join() is used to wait for all of the enqueued tasks to be completed. Instead of watching the threads, the work to be done is considered before exiting. Once that work is completed, the process is allowed to exit and all of the daemon threads are automatically killed (which is the desired outcome, since all of the work is done anyway).
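For contrast, the Thread.join() approach looks roughly like the sketch below. This is a hypothetical example, not code from the recommended reading: since the workers are not daemons, they must be told to stop on their own, which is done here by feeding each thread a None sentinel.

```python
import queue
import threading

def worker(que):
    while True:
        task = que.get()
        if task is None:  # sentinel value: no more work, exit the loop
            break
        print("processing task", task)

que = queue.Queue()
for task in range(20):
    que.put(task)

# Spawn non-daemon threads, keeping track of each one
threads = []
for i in range(4):
    t = threading.Thread(target=worker, args=(que,))
    t.start()
    threads.append(t)

# One sentinel per thread so every worker eventually exits its loop
for t in threads:
    que.put(None)

# Block until each thread has finished before letting the process exit
for t in threads:
    t.join()
```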
Slightly More Advanced Example
To build further familiarity with the concept, it would be a good idea to look at a slightly more advanced example that gives a better sense of how queues and threads can be more useful.
Python3
# run with python3.3 onwards
import queue
import threading

lock = threading.Lock()  # we will use this thread lock during the work function

# Work function from step 4
def writeToFile(filename, content):
    lock.acquire()  # set the lock
    print("%s is writing %d to %s" % (threading.current_thread().name, content, filename))
    with open(filename, "a+") as file:
        file.write(str(content) + "\n")
    # NOTE: practically speaking, it's inefficient to keep reopening the file, but for example's sake we will do it
    lock.release()  # unlock

# Worker/Threader function from step 3
def workGiver(work_queue):
    while True:
        file = work_queue.get()
        for num in range(100):
            writeToFile(file, num)
        work_queue.task_done()

if __name__ == "__main__":
    # Step 1
    # Create some work to do in the form of tasks in a queue
    # Each of the "tasks" in the queue is an argument for the work function: writeToFile()
    work_queue = queue.Queue()
    work_queue.put("file1.txt")
    work_queue.put("file2.txt")
    work_queue.put("file3.txt")

    # Step 2
    # Spawn/Generate threads to do the work
    for i in range(8):
        print("SPAWNING thread%d" % i)
        thread = threading.Thread(name="Thread{}".format(i), target=workGiver, args=(work_queue,), daemon=True)
        thread.start()

    # wait for all of the tasks to be completed
    work_queue.join()
Let's start by understanding what this code wants to do in the first place. This program aims to create 3 files, file1.txt, file2.txt and file3.txt, and populate each of them with the numbers from 0 to 99 (i.e. [0, 99]).
What's new in this code vs the previous one?
  1. A thread lock is used in the Work function (thread locks explained in the recommended reading at the beginning of this article - Source 2 for threads)
  2. The action is different: instead of printing out numbers, the more useful task of writing files simultaneously is performed.
But that's it. So it's not really all that different, but now using queues in threading starts to show more promise for being useful. More advanced examples are beyond the scope of this article.
And with that, this article ends. Hopefully these basic examples have made it clear how queues can be used in multi-threading in Python 3.