Threads, Queues, and Concurrency
It can be challenging or even overwhelming to prepare for a job interview or project meeting. The number of subjects you need
to be familiar with as a developer is staggering. Mastering the fundamentals of the platform you're developing software for
should always be a key focus area. That is why a significant portion of the content published on Cocoacasts focuses on the
fundamentals of Swift and Cocoa development.
This episode zooms in on threading and concurrency. It isn't uncommon for developers to avoid these subjects because they
seem complex or more advanced. It's true that threading and concurrency aren't the easiest concepts to learn, but you need to
understand them to build software that is performant and reliable.
What Is a Thread?
Let's start with a basic question many developers don't have a good answer to. What is a thread? I won't go into the low-level details of threads and concurrency. While that is undoubtedly interesting, the nitty-gritty details are less important if you're a Swift or Cocoa developer.
I don't want this episode to turn into a theoretical discussion of threading and concurrency. Let's make it more tangible.
Launch Xcode and create a new project by choosing the Single View App template from the iOS section.
Name the project Threads and click Create to create the project.
Run the application in the simulator or on a physical device. With the application running, pause the execution of the
application's process by clicking the Pause button in the Debug Bar at the bottom.
Open the Debug Navigator on the left. The Debug Navigator shows us a list of the threads the application uses to perform its
work.
An application isn't functional unless the code we wrote is executed. A thread is nothing more than a context in which
commands are executed. It's a line or thread of execution through the application we created. The stack trace of a thread
illustrates this. Each frame of the stack trace is a command that is being executed by the thread. You can learn more about
threads and stack traces in Debugging Applications With Xcode.
A thread has many more properties, such as a unique identifier, a stack, and a collection of registers. Don't worry about these
properties for now. They are not important for the rest of this discussion.
Every Cocoa application has a minimum of one thread, the main thread. The main thread is the thread on which the application
starts its life. The topmost thread in the list of threads in the Debug Navigator is the main thread. We talk more about the main
thread in the next episode.
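None of this requires code yet, but a quick sketch can make it concrete. Foundation's Thread class represents a thread of execution; the print statements below are purely illustrative and aren't part of the episode's project.

```swift
import Foundation

// Thread.current describes the thread a piece of code runs on, and
// Thread.isMainThread reports whether that thread is the main thread.
print("On the main thread? \(Thread.isMainThread)") // true at launch

// Manually spawning a thread is rarely necessary, but it illustrates that a
// thread is simply another line of execution through the application.
Thread.detachNewThread {
    print("On the main thread? \(Thread.isMainThread)") // false
    print(Thread.current)
}
```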
Despite the simplicity of the project, the Debug Navigator shows us that the application uses multiple threads to carry out its
tasks. We could say that the application is multithreaded since it uses more than one thread. Multithreading is a pattern to
increase the performance of an application. By scheduling work on multiple threads, more work can be performed in parallel.
Performing work in parallel is also known as concurrency. Multiple commands are executed in parallel or concurrently.
Multithreading and concurrency have gained in importance over the years because devices have transitioned from having a
single processor with a single core to one or more processors with multiple cores. Taking advantage of this power is important,
but it's also what makes software development complex and challenging. There are several options to manage that complexity.
Grand Central Dispatch is one of them.
As I mentioned earlier, the devices we develop applications for ship with powerful processors, and it's important to take advantage of that power. An application needs to be performant and responsive. Achieving these goals can be challenging.
Grand Central Dispatch manages a collection of dispatch queues. They are usually referred to as queues. The work submitted to
these dispatch queues is executed on a pool of threads. As a developer, you don't need to worry about which thread is used to
execute a block of work and Grand Central Dispatch doesn't want you to know. It doesn't offer any guarantees. Which thread is
used for a particular task is an implementation detail of Grand Central Dispatch. We take a closer look at Grand Central
Dispatch in a later episode.
What's important to understand is that Grand Central Dispatch operates at the system level, which means that it has an
accurate overview of which processes are running, which resources are available, and how to best schedule the work that needs
to be done. It's a very powerful solution.
What Is a Queue?
I already mentioned that Grand Central Dispatch manages a collection of dispatch queues. As the name implies, a dispatch
queue is a queue onto which work can be scheduled for execution.
There are two types of dispatch queues, serial dispatch queues and concurrent dispatch queues. It's easy to understand the
difference. A serial queue executes the commands it's given in a predictable order. If you schedule two tasks on a serial dispatch
queue, then the second task is executed after the first task has completed. The advantage is predictability. The disadvantage is decreased performance. The second task needs to wait until the first task has finished executing.
Concurrent queues are different in that they sacrifice predictability for performance. The tasks scheduled on a concurrent queue are executed in parallel or concurrently. If you schedule two tasks on a concurrent dispatch queue, then the second task can be scheduled before the first task has finished executing. This means that it's possible for the second task to finish executing before the first one.
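Before we look at a more concrete example, here's a minimal standalone sketch of both queue types. The queue labels are hypothetical; a dispatch queue is serial unless you pass the .concurrent attribute.

```swift
import Foundation

// A dispatch queue is serial by default.
let serialQueue = DispatchQueue(label: "com.example.serial")

// Passing the .concurrent attribute creates a concurrent dispatch queue.
let concurrentQueue = DispatchQueue(label: "com.example.concurrent", attributes: .concurrent)

serialQueue.async { print("serial task 1") }
serialQueue.async { print("serial task 2") }         // always runs after task 1

concurrentQueue.async { print("concurrent task 1") }
concurrentQueue.async { print("concurrent task 2") } // may finish before task 1
```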
Let me illustrate this with a simple example. I've created a simple application that downloads three images and displays each
image in an image view. The application displays two buttons. The button on the left is labeled Serial and performs its work on a
serial dispatch queue. The button on the right is labeled Concurrent and performs its work on a concurrent dispatch queue.
Before I illustrate the difference, let's have a look at the implementation of the ViewController class. It keeps a reference to a
serial dispatch queue and a concurrent dispatch queue. If the left button is tapped, the download(using:) method is invoked and
a reference to the serial dispatch queue is passed in as an argument. If the right button is tapped, the download(using:) method
is invoked and a reference to the concurrent dispatch queue is passed in as an argument. In this example, we explicitly create a
serial and a concurrent dispatch queue. This is fine, but it's more common to ask Grand Central Dispatch for a reference to one
of its dispatch queues. Even though we create a serial and a concurrent dispatch queue, they are managed by Grand Central
Dispatch.
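The finished project isn't reproduced in this text, but the setup might look something like the following sketch. The property names, action names, and queue labels are my own assumptions; only the download(using:) method comes from the episode.

```swift
import UIKit

class ViewController: UIViewController {

    // Serial dispatch queue, explicitly created for this example.
    private let serialQueue = DispatchQueue(label: "com.cocoacasts.serial")

    // Concurrent dispatch queue, created by passing the .concurrent attribute.
    private let concurrentQueue = DispatchQueue(label: "com.cocoacasts.concurrent", attributes: .concurrent)

    // Invoked when the button labeled Serial is tapped.
    @IBAction func didTapSerialButton(_ sender: Any) {
        download(using: serialQueue)
    }

    // Invoked when the button labeled Concurrent is tapped.
    @IBAction func didTapConcurrentButton(_ sender: Any) {
        download(using: concurrentQueue)
    }

    private func download(using queue: DispatchQueue) {
        // See the sketch of this method below.
    }
}
```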
Most interesting is the implementation of the download(using:) method. We reset the image views, start an activity indicator
view for each image view, and schedule a block of work on the dispatch queue that is passed to the download(using:) method. It
isn't a problem if you're not familiar with the API of Grand Central Dispatch. The idea is quite simple.
The application creates a Data instance using the URL instance. The image data is used to create a UIImage instance, which is
assigned to the image property of the image view. Downloading the image data and creating the UIImage instance is done in
the background.
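Sketched out under the same assumptions, with imageViews and imageURLs as assumed outlets and constants and the activity indicator handling left out for brevity, download(using:) might look like this:

```swift
private func download(using queue: DispatchQueue) {
    // Pair each image view with the URL of the image it should display.
    for (imageView, url) in zip(imageViews, imageURLs) {
        // Reset the image view. (Starting the activity indicator views is omitted here.)
        imageView.image = nil

        // Schedule the work on the dispatch queue that was passed in.
        queue.async {
            // Download the image data and create an image in the background.
            guard
                let data = try? Data(contentsOf: url),
                let image = UIImage(data: data)
            else { return }

            // Update the user interface on the main queue.
            DispatchQueue.main.async {
                imageView.image = image
            }
        }
    }
}
```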
The key difference is that a serial dispatch queue executes the work it's given serially or sequentially. This means that the image
data for the second image is fetched after the image data of the first image has completed downloading. The image data of the third image is fetched after the image data of the second image has completed downloading. If that is true, then the first image should be displayed first, then the second image, and finally the third image. In other words, the images are downloaded and
displayed sequentially.
This isn't true if we schedule the work on a concurrent dispatch queue. In that scenario, the tasks we dispatch to the concurrent
queue are executed in parallel. It's possible that the application has finished downloading the image data of the second image before the image data of the first and third image have finished downloading. In that scenario, the second image is displayed before the first image even though the first task was scheduled before the second task.
That's enough theory for now. Let's run the application and find out what happens. If we tap the button on the left, the images are displayed in a predictable order, from top to bottom. If we tap the button on the right, the images are displayed in an unpredictable order. I deliberately made the file size of the third image smaller than that of the other two images to illustrate the difference. Even though the task to fetch the image data of the third image is scheduled last, the third image is displayed first.
Main Queue and Global Queues
We manually created a serial and a concurrent dispatch queue in the example. That's fine, but it's more common to ask Grand
Central Dispatch for an existing dispatch queue. Every application has access to a main queue and several background queues.
As the name implies, the main queue is associated with the main thread. What does that mean? Work that is scheduled on the
main queue is guaranteed to be executed on the main thread. This is useful when a task must be executed on the main thread, such as updating the user interface. We talk more about the main queue and the main thread in
the next episode.
In the example, the application updates the image property of the image views on the main thread by taking advantage of
Grand Central Dispatch. The application asks Grand Central Dispatch for a reference to the main queue and passes it a closure,
a block of work, in which the image property of the image view is set.
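Isolated from the example, that pattern boils down to a few lines. The image view and image below are placeholders; DispatchQueue.main is the main queue Grand Central Dispatch manages for us.

```swift
import UIKit

let imageView = UIImageView() // placeholder image view for illustration

DispatchQueue.global().async {
    // Work performed in the background, for example creating an image from data.
    let image = UIImage(systemName: "photo")

    // Ask Grand Central Dispatch for the main queue and pass it a closure.
    // The closure is guaranteed to be executed on the main thread.
    DispatchQueue.main.async {
        imageView.image = image
    }
}
```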
Grand Central Dispatch also manages a number of concurrent queues that perform work on a pool of background threads.
These dispatch queues are also known as global queues. Remember that a concurrent queue doesn't care about the order of
execution. Performance is key at the cost of predictability.
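Asking for a global queue takes a single line. The quality of service class below is just an example of how you hint at a task's importance.

```swift
import Foundation

// Ask Grand Central Dispatch for one of its global, concurrent queues.
// You specify a quality of service; you don't create or own the queue.
let globalQueue = DispatchQueue.global(qos: .userInitiated)

globalQueue.async {
    // The block runs on one of the pool's background threads. Which thread
    // is used is an implementation detail of Grand Central Dispatch.
    print("On a background thread? \(!Thread.isMainThread)")
}
```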
What's Next?
I hope that this episode has demystified a few more advanced concepts, such as threads, queues, and concurrency. If you take
the time to understand what these concepts entail, then it makes working with technologies like Grand Central Dispatch much
easier. The next episode zooms in on the main thread. What is it and why is it important?
Resources
Finished Project