Inter-Process Communication (IPC) means the ways that separate processes talk or share data with each other. Since each process runs independently, they need special tools to exchange information safely.
Python’s multiprocessing.Queue is one such tool. It allows multiple processes to send and receive data in a safe, organized way, without data getting lost or mixed up.
In this article, we will learn how to create and use queues in Python to pass data smoothly between different processes.
Creating and Using a Basic Queue
Inter-process communication (IPC) allows different processes to share information safely. One of the easiest ways to do this in Python is with multiprocessing.Queue. A queue works like a line where one process can put data in, and another process can take data out in the same order.
In this example, a producer process places a message into the queue using put(). Meanwhile, a consumer process waits and retrieves the message using get(). The queue handles all the synchronization needed, so the processes don’t interfere with each other. This way, data passes safely and reliably from one process to another.
from multiprocessing import Process, Queue

def producer(q):
    # Put a single message into the shared queue
    q.put("Hello from producer!")

def consumer(q):
    # Block until a message is available, then print it
    msg = q.get()
    print(f"Consumer received: {msg}")

if __name__ == "__main__":
    q = Queue()  # Queue shared by both processes
    p1 = Process(target=producer, args=(q,))
    p2 = Process(target=consumer, args=(q,))
    p1.start()
    p2.start()
    p1.join()
    p2.join()
Here, the producer sends a greeting to the consumer through the queue. The consumer waits until the message arrives and then prints it. This simple pattern is the foundation for passing data safely between processes in Python.
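In this example data flows only one way, from producer to consumer. If the receiving process also needs to send something back, a common approach is to pair two queues, one for each direction. Here is a minimal sketch of that idea; the worker function and the queue names are just illustrative choices:
from multiprocessing import Process, Queue

def worker(requests, replies):
    # Read one request, send a reply back on the second queue
    name = requests.get()
    replies.put(f"Hello, {name}!")

if __name__ == "__main__":
    requests = Queue()  # main -> worker
    replies = Queue()   # worker -> main
    p = Process(target=worker, args=(requests, replies))
    p.start()
    requests.put("Harry")
    print(replies.get())  # Prints: Hello, Harry!
    p.join()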
Passing Multiple Items Using Queue
Queues can easily handle multiple pieces of data, not just one. Whether it’s strings, numbers, or even complex objects, you can send many items one after another through the queue. The consumer process can then receive and process each item in order.
In this example, the producer sends five messages into the queue with a small pause between each to simulate real work. The consumer waits and consumes each message, printing them out as they arrive. The queue ensures the messages are passed smoothly and safely from producer to consumer.
from multiprocessing import Process, Queue
import time

def producer(q):
    for i in range(5):
        q.put(f"message {i}")
        print(f"Produced message {i}")
        time.sleep(0.5)  # Simulate some work between messages

def consumer(q):
    for _ in range(5):
        msg = q.get()  # Blocks until the next message arrives
        print(f"Consumed {msg}")

if __name__ == "__main__":
    q = Queue()
    p1 = Process(target=producer, args=(q,))
    p2 = Process(target=consumer, args=(q,))
    p1.start()
    p2.start()
    p1.join()
    p2.join()
This shows how multiple items flow easily between processes using the queue, allowing for continuous and ordered data exchange.
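Here the consumer knows in advance that exactly five messages are coming. When the number of items isn’t known ahead of time, a common alternative is to have the producer send a sentinel value such as None to signal the end of the stream. A minimal sketch of that variation:
from multiprocessing import Process, Queue

def producer(q):
    for i in range(5):
        q.put(f"message {i}")
    q.put(None)  # Sentinel: tells the consumer there is nothing more to come

def consumer(q):
    while True:
        msg = q.get()
        if msg is None:  # Stop when the sentinel arrives
            break
        print(f"Consumed {msg}")

if __name__ == "__main__":
    q = Queue()
    p1 = Process(target=producer, args=(q,))
    p2 = Process(target=consumer, args=(q,))
    p1.start()
    p2.start()
    p1.join()
    p2.join()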
Using Queue to Pass Complex Objects
Queues don’t just carry simple strings or numbers; they can pass any Python object that can be pickled. This means you can send dictionaries, lists, or even instances of your own classes between processes easily.
In this example, the sender puts a dictionary with a name and score into the queue. The receiver takes it out and prints the full dictionary. This shows how flexible queues are, handling complex data without any extra work.
from multiprocessing import Process, Queue

def sender(q):
    # Any picklable object can go into the queue, including dictionaries
    data = {"name": "Harry", "score": 95}
    q.put(data)

def receiver(q):
    data = q.get()
    print(f"Received data: {data}")

if __name__ == "__main__":
    q = Queue()
    p1 = Process(target=sender, args=(q,))
    p2 = Process(target=receiver, args=(q,))
    p1.start()
    p2.start()
    p1.join()
    p2.join()
This example highlights the power of queues to share diverse data types across processes, making inter-process communication simple and flexible.
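Because anything picklable can travel through a queue, the same approach works for instances of your own classes, as long as the class is defined at module level so both processes can unpickle it. A small sketch using a made-up Player class:
from multiprocessing import Process, Queue

class Player:
    # Defined at module level so it can be pickled and unpickled in both processes
    def __init__(self, name, score):
        self.name = name
        self.score = score

def sender(q):
    q.put(Player("Harry", 95))

def receiver(q):
    player = q.get()
    print(f"Received: {player.name} scored {player.score}")

if __name__ == "__main__":
    q = Queue()
    p1 = Process(target=sender, args=(q,))
    p2 = Process(target=receiver, args=(q,))
    p1.start()
    p2.start()
    p1.join()
    p2.join()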
Multiple Producers and Consumers Sharing One Queue
Queues work great when many producers and consumers need to share data safely. Multiple producers can put items into the queue while several consumers take them out. The queue acts like a buffer, managing the flow so everything stays organized.
In this example, two producers each create three messages, and two consumers each consume three messages. The production and consumption happen in an interleaved way, showing how the queue handles multiple processes smoothly.
from multiprocessing import Process, Queue
import time
import random

def producer(name, q):
    for i in range(3):
        msg = f"{name} produced {i}"
        q.put(msg)
        print(msg)
        time.sleep(random.uniform(0.1, 0.5))  # Simulate variable production time

def consumer(name, q):
    for _ in range(3):
        msg = q.get()
        print(f"{name} consumed: {msg}")

if __name__ == "__main__":
    q = Queue()
    producers = [Process(target=producer, args=(f"Producer-{i}", q)) for i in range(2)]
    consumers = [Process(target=consumer, args=(f"Consumer-{i}", q)) for i in range(2)]
    for p in producers + consumers:
        p.start()
    for p in producers + consumers:
        p.join()
This example shows how queues keep data flowing properly between multiple producers and consumers without losing any messages.
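When you also want to know that every queued item has actually been processed, the standard library provides multiprocessing.JoinableQueue, where each consumer calls task_done() and the main process calls join() on the queue itself. A minimal sketch, using daemon worker processes that simply stop when the main process exits:
from multiprocessing import Process, JoinableQueue

def worker(name, q):
    while True:
        item = q.get()
        print(f"{name} processed: {item}")
        q.task_done()  # Acknowledge the item so q.join() can return

if __name__ == "__main__":
    q = JoinableQueue()
    workers = [Process(target=worker, args=(f"Worker-{i}", q), daemon=True)
               for i in range(2)]
    for w in workers:
        w.start()
    for i in range(6):
        q.put(f"task {i}")
    q.join()  # Blocks until every task has been marked done
    print("All tasks processed")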
Closing and Joining Queues
After a process finishes sending data, it’s important to close the queue properly. Calling .close() on the queue tells it that no more items will be added by that process. Then, .join_thread() waits for the queue’s background feeder thread to finish cleaning up.
In this example, the producer puts three items into the queue; once it finishes, the main process closes the queue. The consumer keeps getting items until the queue is empty, using a timeout to know when to stop. Using .close() and .join_thread() helps ensure the queue finishes smoothly without hanging or errors.
from multiprocessing import Process, Queue
from queue import Empty

def producer(q):
    for i in range(3):
        q.put(i)  # Put items into the queue for the consumer to get

def consumer(q):
    while True:
        try:
            # Try to get an item from the queue with a timeout
            item = q.get(timeout=1)
            print(f"Got item: {item}")
        except Empty:
            # Timeout means no more items expected, exit the loop
            break

if __name__ == "__main__":
    q = Queue()  # Create a multiprocessing queue
    p1 = Process(target=producer, args=(q,))
    p2 = Process(target=consumer, args=(q,))
    p1.start()  # Start the producer process
    p2.start()  # Start the consumer process
    p1.join()   # Wait until the producer finishes putting all items
    q.close()   # Close the queue in the main process to signal no more data will be added
    q.join_thread()  # Wait for the queue’s background thread to finish cleanly
    p2.join()   # Wait for the consumer to finish processing all items
This shows how to cleanly signal when the queue is done and avoid leftover threads.
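One related caution: a process that has put items on a queue will not exit until its buffered items have been flushed to the underlying pipe, so joining such a process before the queue has been drained can deadlock. A minimal sketch of the safe ordering, with an arbitrary item count chosen only to make the buffering noticeable:
from multiprocessing import Process, Queue

def producer(q):
    for i in range(100_000):
        q.put(i)  # A large amount of data ends up buffered by the feeder thread

if __name__ == "__main__":
    q = Queue()
    p = Process(target=producer, args=(q,))
    p.start()
    # Drain the queue BEFORE joining the producer. Joining first can deadlock,
    # because the producer waits for its buffered items to be flushed while
    # the main process waits for the producer to exit.
    for _ in range(100_000):
        q.get()
    p.join()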
Conclusion
In this article, we explored how to use multiprocessing.Queue to pass data safely between processes in Python. We saw how queues help connect producers and consumers without needing shared memory or locks. From sending simple strings to full Python objects like dictionaries, queues make inter-process communication smooth and reliable.
We also learned how queues support multiple producers and consumers, allowing flexible data flow in both directions. Finally, we looked at how to clean up properly by closing the queue and joining its background thread.
Now it’s your turn—try using queues with your own programs. Whether you’re simulating sensors, building a task pipeline, or just passing messages between workers, queues make coordination clear and easy.