
Real Python: Python Stacks, Queues, and Priority Queues in Practice


Queues are the backbone of numerous algorithms found in games, artificial intelligence, satellite navigation, and task scheduling. They’re among the top abstract data types that computer science students learn early in their education. At the same time, software engineers often leverage higher-level message queues to achieve better scalability of a microservice architecture. Plus, using queues in Python is simply fun!

Python provides a few built-in flavors of queues that you’ll see in action in this tutorial. You’re also going to get a quick primer on the theory of queues and their types. Finally, you’ll take a look at some external libraries for connecting to popular message brokers available on major cloud platform providers.

In this tutorial, you’ll learn how to:

  • Differentiate between various types of queues
  • Implement the queue data type in Python
  • Solve practical problems by applying the right queue
  • Use Python’s thread-safe, asynchronous, and interprocess queues
  • Integrate Python with distributed message queue brokers through libraries

To get the most out of this tutorial, you should be familiar with Python’s sequence types, such as lists and tuples, and the higher-level collections in the standard library.

You can download the complete source code for this tutorial with the associated sample data by clicking the link in the box below:

Get Source Code: Click here to get access to the source code and sample data that you’ll use to explore queues in Python.

Learning About the Types of Queues

A queue is an abstract data type that represents a sequence of elements arranged according to a set of rules. In this section, you’ll learn about the most common types of queues and their corresponding element arrangement rules. At the very least, every queue provides operations for adding and removing elements in constant time or O(1) using the Big O notation. That means both operations should be instantaneous regardless of the queue’s size.
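If you’re curious what constant time looks like in practice, here’s a quick timing sketch, assuming only the standard library, that contrasts collections.deque.popleft(), which runs in O(1), with list.pop(0), which takes linear time because every remaining element has to shift:

    from collections import deque
    from timeit import timeit

    items = list(range(100_000))
    fifo_list = items.copy()
    fifo_deque = deque(items)

    # Removing from the front of a deque takes constant time...
    print(timeit(fifo_deque.popleft, number=1_000))

    # ...while removing from the front of a list shifts every remaining element.
    print(timeit(lambda: fifo_list.pop(0), number=1_000))

The exact numbers will vary from machine to machine, but the deque version should stay flat no matter how many elements you start with.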

Some queues may support other, more specific operations. It’s time to learn more about them!

Queue: First-In, First-Out (FIFO)

The word queue can have different meanings depending on the context. However, when people refer to a queue without using any qualifiers, they usually mean a FIFO queue, which resembles a line that you might find at a grocery checkout or tourist attraction:

Tourists Queuing Up to Enter the American Museum of Natural History in New York

Note that unlike the line in the photo, where people are clustering side by side, a queue in a strict sense will be single file, with people admitted one at a time.

FIFO is short for first-in, first-out, which describes the flow of elements through the queue. Elements in such a queue will be processed on a first-come, first-served basis, which is how most real-life queues work. To better visualize the element movement in a FIFO queue, have a look at the following animation:

Unbounded FIFO Queue

Notice that, at any given time, a new element is only allowed to join the queue on one end, called the tail (on the right in this example), while the oldest element must leave the queue from the opposite end, known as the head. When an element leaves the queue, all of its followers shift by exactly one position towards the head. These rules ensure that elements are processed in the order of their arrival.

Note: You can think of elements in a FIFO queue as cars stopping at a traffic light.

Adding an element to the FIFO queue is commonly referred to as an enqueue operation, while retrieving one from it is known as a dequeue operation. Don’t confuse a dequeue operation with the deque (double-ended queue) data type that you’ll learn about later!
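As a quick preview of what’s ahead, here’s a minimal sketch of those two operations built on collections.deque from the standard library, which can add and remove elements efficiently on both ends. The enqueue and dequeue comments simply map onto .append() and .popleft():

    from collections import deque

    fifo = deque()

    # Enqueue: new elements always join at the tail of the queue.
    fifo.append("first")
    fifo.append("second")
    fifo.append("third")

    # Dequeue: the oldest element always leaves from the head of the queue.
    print(fifo.popleft())  # first
    print(fifo.popleft())  # second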

Enqueuing and dequeuing are two independent operations that may be taking place at different speeds. This fact makes FIFO queues the perfect tool for buffering data in streaming scenarios and for scheduling tasks that need to wait until some shared resource becomes available. For example, a web server flooded with HTTP requests might place them in a queue instead of immediately rejecting them with an error.

Note: In programs that leverage concurrency, a FIFO queue often becomes the shared resource itself to facilitate two-way communication between asynchronous workers. By temporarily locking the read or write access to its elements, a blocking queue can elegantly coordinate a pool of producers and a pool of consumers. You’ll find more information about this use case in later sections about queues in multithreading and multiprocessing.
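To make that note a bit more concrete, here’s a rough, simplified sketch, not the tutorial’s own implementation, of one producer and one consumer coordinating through Python’s thread-safe queue.Queue. The task names and the None sentinel are purely illustrative:

    import queue
    import threading

    buffer = queue.Queue()  # a thread-safe, blocking FIFO queue

    def producer():
        for task in ["parse", "render", "upload"]:
            buffer.put(task)  # would block if the queue were bounded and full
        buffer.put(None)  # sentinel value signaling that no more work is coming

    def consumer():
        # .get() blocks until an element becomes available
        while (task := buffer.get()) is not None:
            print(f"Processing {task!r}")

    threading.Thread(target=producer).start()
    threading.Thread(target=consumer).start()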

Another point worth noting about the queue depicted above is that it can grow without bounds as new elements arrive. Picture a checkout line stretching to the back of the store during a busy shopping season! In some situations, however, you might prefer to work with a bounded queue that has a fixed capacity known upfront. A bounded queue can help to keep scarce resources under control in two ways:

  1. By irreversibly rejecting elements that don’t fit
  2. By overwriting the oldest element in the queue

Under the first strategy, once a FIFO queue becomes saturated, it won’t take any more elements until others leave the queue to make some space. You can see an animated example of how this works below:

Bounded FIFO Queue (Bounce)
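Both strategies are available in the standard library. In this rough sketch, queue.Queue with a maxsize rejects elements that don’t fit when you call .put_nowait(), while collections.deque with a maxlen silently overwrites the oldest element:

    import queue
    from collections import deque

    # Strategy 1: irreversibly reject elements that don't fit.
    bounded = queue.Queue(maxsize=2)
    bounded.put_nowait("a")
    bounded.put_nowait("b")
    try:
        bounded.put_nowait("c")  # the queue is saturated, so this raises queue.Full
    except queue.Full:
        print("Element rejected")

    # Strategy 2: overwrite the oldest element in the queue.
    recent = deque(maxlen=2)
    recent.append("a")
    recent.append("b")
    recent.append("c")  # "a" is silently pushed out
    print(recent)  # deque(['b', 'c'], maxlen=2)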

Read the full article at https://realpython.com/queue-in-python/ »



