Introduction
This article is part of the data structures category.
A queue is an abstract and dynamic data structure: abstract because it is defined by its behavior rather than by its implementation details, and dynamic because it can grow and shrink after it has been created.
A queue handles its elements in FIFO (first-in, first-out) order: the first element inserted into the queue is the first one to be picked up. This design makes the data structure excellent for use cases where the order in which elements are processed matters.
Implementation
Since a queue is an abstract data structure, in this article it is implemented on top of a doubly linked list. Another valid and efficient implementation can be found in the circular buffer article.
The implementation of a queue in this article is a type named Queue, which has four fields:
- head: a pointer to the first element of the queue; when null, it means there are no elements at all.
- tail: a pointer to the last element of the queue; when null, it means there are no elements at all.
- length: to keep track of how many elements the queue is storing without traversing the whole sequence.
- mutex: to ensure that the queue is thread-safe.
In Go, an implementation may look like this:
type Queue struct {
    mu   sync.Mutex
    head *node
    tail *node
    len  uint
}
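No constructor is strictly required for this type: the zero value of sync.Mutex is an unlocked mutex, nil pointers represent an empty queue, and len starts at zero. A hypothetical helper, not part of the article's type but shown here as a sketch, can still make the construction explicit:

    // NewQueue returns a pointer to an empty, ready-to-use queue.
    // The zero value of Queue is already valid, so this helper only
    // makes the intent explicit at the call site.
    func NewQueue() *Queue {
        return &Queue{}
    }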
Along with the Queue type, it is necessary to create a node type as well, which has three fields:
- val: the value stored in the queue
- next: a pointer to the next element; when null, it means there are no elements after
- prev: a pointer to the previous element; when null, it means there are no elements before
type node struct {
    val  interface{}
    next *node
    prev *node
}
The node uses an empty interface to store any kind of value.
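Because val is an empty interface, values come back from the queue as interface{}, and callers usually restore the concrete type with a type assertion. A minimal sketch, assuming the Pop method defined later in the article and an imported fmt package (the helper name is illustrative, not part of the article's API):

    // printIfString pops one element and prints it only when it is a string.
    func printIfString(q *Queue) {
        v, ok := q.Pop()
        if !ok {
            return // the queue was empty
        }
        if s, isString := v.(string); isString {
            fmt.Println(s)
        }
    }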
A queue has two main behaviors:
- Push: add an element at the beginning of the queue
- Pop: get an element from the end of the queue
Push
Depending on the implementation, pushing into a queue can be a cheap operation. It is possible to shape this data structure so that no traversal of the existing elements is needed, achieving good efficiency.
Start with the case of pushing into an empty queue: the new element is simply placed in the first available position and becomes both the head and the tail.
Similarly, to add an element to a non-empty queue, place the new element before the most recently added one; the new element becomes the head, while the oldest element remains at the tail.
A queue can grow indefinitely as long as there is enough memory.
The time complexity of pushing an element into a queue, as already mentioned, depends on the implementation. With the right one, it is common to achieve O(1) time complexity.
A simple Go method can implement this behavior:
func (q *Queue) Push(v interface{}) {
    // lock while the operation is ongoing
    q.mu.Lock()
    defer q.mu.Unlock()
    // increase the counter once the function returns
    defer func() { q.len++ }()
    // if there are no nodes,
    // create a new one and set it as head as well as tail
    if q.len == 0 {
        n := &node{val: v}
        q.head = n
        q.tail = n
        return
    }
    // create a new node whose next pointer links to the current head
    n := &node{val: v, next: q.head}
    // link the old head back to the new node, then promote the new node to head
    q.head.prev = n
    q.head = n
}
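As a quick sketch of how the pointers evolve after a couple of pushes (assuming the Queue type defined above):

    q := &Queue{}
    q.Push("A") // head and tail both point to the node holding "A"
    q.Push("B") // "B" becomes the new head; "A" stays at the tail
    // q.len is now 2, and "A" will be the first element returned by Pop.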
Pop
Popping an element from a queue can be a cheap operation for the same reason as the push operation.
Unlike the Push behavior, the Pop one should fail when the queue is empty, returning an error or using the “comma ok” idiom.
Consider a queue that already contains two elements. To pop an element, take the one at the tail, remove it from the queue, and hand it to the caller.
Popping again works the same way: the oldest remaining element, now both head and tail, is removed and returned.
When popping from an empty queue, it is necessary to somehow communicate to the caller that no element has been popped.
The time complexity of popping an element from a queue is coupled with its implementation details, but it is common to achieve O(1) time complexity.
This behavior can be written in Go like this:
func (q *Queue) Pop() (interface{}, bool) {
    // lock while the operation is ongoing
    q.mu.Lock()
    defer q.mu.Unlock()
    // if there are no items, return false to notify that no node has been removed
    if q.len == 0 {
        return nil, false
    }
    // decrease the counter once the function returns
    defer func() { q.len-- }()
    val := q.tail.val
    // if there's only one element, unset head & tail
    // and return true to notify an element has been popped
    if q.len == 1 {
        q.head = nil
        q.tail = nil
        return val, true
    }
    // promote the second-to-last node as tail,
    // set the next pointer of the new tail to nil,
    // and return true to notify a node has been popped
    q.tail = q.tail.prev
    q.tail.next = nil
    return val, true
}
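Putting both methods together, a small program (assuming the Queue type and methods above live in the same package) shows the FIFO order and the "comma ok" handling of the empty case:

    package main

    import "fmt"

    func main() {
        q := &Queue{}
        q.Push("A")
        q.Push("B")
        q.Push("C")

        // Elements come out in insertion order: A, B, C.
        for {
            v, ok := q.Pop()
            if !ok {
                break // the queue is empty
            }
            fmt.Println(v)
        }
    }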
Conclusion
The queue data structure is quite simple to implement and has plenty of real-life uses, such as a task scheduler (similar to a printer queue on a personal computer), a message broker (RabbitMQ), a call center, and more.
Sometimes other methods, such as Search, appear as part of a queue. When that happens, please consider using another data structure, since a queue has a simple and narrowed-down use case.
To end the article, here is a recap of the time complexity of the operations a queue supports:
| Operation | Time complexity |
|---|---|
| Push | O(1) |
| Pop | O(1) |