Inter-Process Communication (IPC) lets different processes talk and share data while your program runs. Sharing data is important because sometimes one process needs to know what another is doing or update information both can use.
Manager objects in Python make this sharing easy. They create shared versions of common Python data types like lists and dictionaries. These objects live in a background server process, and multiple processes can safely read and write to them through lightweight proxies at the same time.
In this article, you will learn how to create and use Manager objects such as managed lists, dictionaries, namespaces, and even custom objects. You’ll see clear examples of how to share and update data across processes, making your programs smarter and more connected.
Setting Up: Imports and Starting a Manager
To work with Manager objects, we first need to import the right modules. We’ll use Python’s multiprocessing module to handle processes and time to add pauses in our examples.
Next, we start a Manager instance. This manager lets us create shared objects like lists and dictionaries that multiple processes can access safely.
Here’s a simple code snippet to set things up:
from multiprocessing import Process, Manager
import time

if __name__ == "__main__":
    manager = Manager()

    # Example: Create a shared list using the manager
    shared_list = manager.list()
    print("Manager and shared list are ready to use!")
This code imports what we need and creates a manager. From here, you can create shared objects like shared_list that work across processes. In the next sections, we’ll see how to use these objects in real scenarios.
Using Managed Lists
A managed list is a special list held by the manager’s server process. Multiple processes can read from and write to it safely through a proxy. You create it using Manager().list().
Let’s see a fun example: imagine two people adding items to a shared shopping list at the same time. Each process will add its own items to the list.
from multiprocessing import Process, Manager
import time

def add_fruits(shopping_list):
    fruits = ["apples", "bananas", "oranges"]
    for fruit in fruits:
        shopping_list.append(fruit)
        print(f"Added {fruit} to the shopping list.")
        time.sleep(0.2)

def add_veggies(shopping_list):
    veggies = ["carrots", "lettuce", "tomatoes"]
    for veg in veggies:
        shopping_list.append(veg)
        print(f"Added {veg} to the shopping list.")
        time.sleep(0.3)

if __name__ == "__main__":
    manager = Manager()
    shopping_list = manager.list()

    p1 = Process(target=add_fruits, args=(shopping_list,))
    p2 = Process(target=add_veggies, args=(shopping_list,))

    p1.start()
    p2.start()
    p1.join()
    p2.join()

    print("\nFinal shopping list:")
    for item in shopping_list:
        print(f"- {item}")
In this code, the shopping_list is a managed list shared between two processes. Each process adds different items, and all changes show up in the same list. Using Manager().list() makes sharing and updating the list simple and safe.
Using Managed Dictionaries
A managed dictionary works like a regular Python dictionary, but it is held by the manager’s server process. This means multiple processes can safely read and update it at the same time. You create it using Manager().dict().
Here’s an example where two processes update a shared inventory dictionary. One process adds apples, the other adds bananas, updating counts as they go.
from multiprocessing import Process, Manager
import time

def add_apples(inventory):
    for _ in range(3):
        inventory["apples"] = inventory.get("apples", 0) + 1
        print(f"Added an apple. Total apples: {inventory['apples']}")
        time.sleep(0.2)

def add_bananas(inventory):
    for _ in range(5):
        inventory["bananas"] = inventory.get("bananas", 0) + 1
        print(f"Added a banana. Total bananas: {inventory['bananas']}")
        time.sleep(0.1)

if __name__ == "__main__":
    manager = Manager()
    inventory = manager.dict()

    p1 = Process(target=add_apples, args=(inventory,))
    p2 = Process(target=add_bananas, args=(inventory,))

    p1.start()
    p2.start()
    p1.join()
    p2.join()

    print("\nFinal inventory counts:")
    for fruit, count in inventory.items():
        print(f"- {fruit}: {count}")
In this example, the inventory dictionary is shared between processes. Each process updates its own fruit count using inventory[key] = value. Because the two processes write to different keys, their read-modify-write updates never collide, and the counts stay accurate across processes.
Using Managed Namespaces
A managed namespace is like a shared container for multiple related variables. You create it with Manager().Namespace(). It lets processes share and update named attributes easily.
Imagine a simple game where two players have stats like health and score. We can use a namespace to share these stats across processes.
from multiprocessing import Process, Manager
import time

def player_one_actions(stats):
    for _ in range(3):
        stats.health -= 10
        stats.score += 5
        print(f"Player One: Health={stats.health}, Score={stats.score}")
        time.sleep(0.3)

def player_two_actions(stats):
    for _ in range(3):
        stats.health -= 7
        stats.score += 7
        print(f"Player Two: Health={stats.health}, Score={stats.score}")
        time.sleep(0.4)

if __name__ == "__main__":
    manager = Manager()
    player_stats = manager.Namespace()

    # Initialize shared variables
    player_stats.health = 100
    player_stats.score = 0

    p1 = Process(target=player_one_actions, args=(player_stats,))
    p2 = Process(target=player_two_actions, args=(player_stats,))

    p1.start()
    p2.start()
    p1.join()
    p2.join()

    print(f"\nFinal Player Stats: Health={player_stats.health}, Score={player_stats.score}")
In this example, the player_stats namespace holds two shared variables: health and score. Both processes update these values, and the changes are visible to each other. One caution: a compound update like stats.health -= 10 is a separate read and write, so two processes hitting the same attribute at the same moment can overwrite each other; for strict accuracy, guard such updates with a lock. Even so, a managed namespace makes sharing multiple related pieces of data simple and organized.
Using Managed Queues and Other Proxy Objects
A managed queue lets multiple processes safely send and receive messages. You create it with Manager().Queue(). It works like a normal queue but lives in the manager’s server process, so it can be shared between processes.
Here’s a simple example where two producers add messages to the queue and one consumer reads them:
from multiprocessing import Process, Manager
import time

def producer(name, queue):
    for i in range(3):
        message = f"{name} says hello {i}"
        queue.put(message)
        print(f"Produced: {message}")
        time.sleep(0.2)

def consumer(queue):
    for _ in range(6):  # Expecting 6 messages total
        msg = queue.get()
        print(f"Consumed: {msg}")
        time.sleep(0.3)

if __name__ == "__main__":
    manager = Manager()
    shared_queue = manager.Queue()

    p1 = Process(target=producer, args=("Producer1", shared_queue))
    p2 = Process(target=producer, args=("Producer2", shared_queue))
    c = Process(target=consumer, args=(shared_queue,))

    p1.start()
    p2.start()
    c.start()

    p1.join()
    p2.join()
    c.join()
In this example, two producers put messages into the shared queue, and the consumer reads them one by one. The Manager().Queue() ensures messages pass safely between processes without conflict.
Besides queues, the manager can create other proxy objects for sharing things like locks or events, but queues are one of the most common for passing data safely.
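As a quick illustration of those other proxy objects, a manager-created Event can coordinate timing between processes much like the queue coordinates data. A small sketch (the waiter function name is just for illustration):

```python
from multiprocessing import Process, Manager

def waiter(event, results):
    event.wait()                 # block until the event is set
    results.append("waiter woke up")

if __name__ == "__main__":
    with Manager() as manager:
        event = manager.Event()  # an Event proxy, shareable like a queue
        results = manager.list()

        p = Process(target=waiter, args=(event, results))
        p.start()

        results.append("main sets the event")
        event.set()              # release the waiting process
        p.join()

        print(list(results))     # ['main sets the event', 'waiter woke up']
```

Manager().Lock(), Manager().Semaphore(), and Manager().Condition() work the same way: create them once, then pass the proxies to your worker processes.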
Custom Managed Objects
You can create your own custom class and share its instances between processes using a Manager. To do this properly, you need to register the class with the Manager and explicitly expose the methods you want the proxy to access. This allows the Manager to safely create and manage shared instances of your custom object.
For example, imagine a simple Counter class with methods to increment, reset, and get the current value. By registering this class with BaseManager and exposing these methods, you can create a shared Counter object that multiple processes can modify and read concurrently.
Here is the complete example:
from multiprocessing import Process
from multiprocessing.managers import BaseManager
import time

class Counter:
    def __init__(self):
        # Initialize counter value to zero
        self.value = 0

    def increment(self):
        # Increase the counter by one
        self.value += 1

    def reset(self):
        # Reset the counter to zero
        self.value = 0

    def get_value(self):
        # Return the current counter value
        return self.value

def worker(counter):
    # Each worker increments the counter 5 times,
    # printing the value after each increment
    for _ in range(5):
        counter.increment()
        print(f"Counter value: {counter.get_value()}")
        time.sleep(0.2)

# Register the Counter class with BaseManager
# Specify which methods should be accessible through the proxy
BaseManager.register('Counter', Counter, exposed=['increment', 'reset', 'get_value'])

if __name__ == "__main__":
    # Create and start the manager
    manager = BaseManager()
    manager.start()

    # Create a managed Counter instance shared between processes
    shared_counter = manager.Counter()

    # Create two processes that will share and modify the counter
    p1 = Process(target=worker, args=(shared_counter,))
    p2 = Process(target=worker, args=(shared_counter,))

    # Start the processes
    p1.start()
    p2.start()

    # Wait for both processes to finish
    p1.join()
    p2.join()

    # Print the final counter value using the exposed method
    print(f"\nFinal counter value: {shared_counter.get_value()}")

    # Shut down the manager when done
    manager.shutdown()
In this example, two processes each call the increment method on the shared Counter instance. Because the Counter class is registered with the Manager and its methods are exposed, the proxy object forwards these method calls across process boundaries. Access to the value attribute goes through the get_value() method because direct attribute access is not available on the proxy.
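You can see this restriction directly: calling an exposed method on the proxy works, while touching the attribute raises AttributeError. A minimal sketch using a trimmed-down Counter:

```python
from multiprocessing.managers import BaseManager

class Counter:
    def __init__(self):
        self.value = 0

    def get_value(self):
        return self.value

# Expose only get_value through the proxy
BaseManager.register('Counter', Counter, exposed=['get_value'])

if __name__ == "__main__":
    manager = BaseManager()
    manager.start()
    c = manager.Counter()

    print(c.get_value())   # 0 -- the exposed method works
    try:
        print(c.value)     # direct attribute access is not proxied
    except AttributeError:
        print("value is not accessible on the proxy")

    manager.shutdown()
```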
This approach lets you share and manage your own custom objects between processes with ease, just as you would with built-in shared types.
Terminating the Manager
When you finish using a Manager, it is important to shut it down properly to release resources and end the background server process. You can do this by calling the shutdown() method on your Manager instance.
Here’s a simple example showing how to shut down the Manager after all processes have completed their work:
from multiprocessing.managers import BaseManager
from multiprocessing import Process

class Dummy:
    def say_hello(self):
        # Simple method to print a message
        print("Hello from managed object!")

# Register the Dummy class with BaseManager so it can create managed proxies
BaseManager.register('Dummy', Dummy)

if __name__ == "__main__":
    # Create and start the manager
    manager = BaseManager()
    manager.start()

    # Create a managed Dummy instance shared between processes
    shared_obj = manager.Dummy()

    # Create a process that calls the say_hello method of the shared object
    p = Process(target=shared_obj.say_hello)
    p.start()

    # Wait for the process to finish
    p.join()

    # Properly shut down the manager to clean up resources
    manager.shutdown()
Alternatively, you can use the Manager as a context manager with the with statement (supported since Python 3.3), which automatically handles starting and shutting down:
from multiprocessing import Manager, Process

def worker(shared_list):
    shared_list.append('task done')

if __name__ == "__main__":
    with Manager() as manager:
        shared_list = manager.list()
        p = Process(target=worker, args=(shared_list,))
        p.start()
        p.join()
        print(shared_list)
Using manager.shutdown() or the context manager ensures your program cleans up properly, avoiding leftover processes or resource leaks.
Full Example: Combining Multiple Manager Objects
Let’s create a fun “Zoo” simulator that uses a managed list, dictionary, and namespace together. Multiple processes will add animals to the list, update counts in the dictionary, and track overall zoo stats in a namespace object.
This example shows how different Manager objects can work together to share and update data safely across processes:
from multiprocessing import Process, Manager
import time

def add_animals(zoo_list, zoo_counts, zoo_stats, animals):
    for animal in animals:
        # Add animal to the shared list
        zoo_list.append(animal)

        # Update count for the animal in the shared dictionary
        if animal in zoo_counts:
            zoo_counts[animal] += 1
        else:
            zoo_counts[animal] = 1

        # Update total animals count in namespace
        zoo_stats.total_animals += 1

        print(f"Added {animal}. Total animals now: {zoo_stats.total_animals}")
        time.sleep(0.3)

if __name__ == "__main__":
    with Manager() as manager:
        # Managed list to hold animal names
        zoo_list = manager.list()

        # Managed dictionary to count each type of animal
        zoo_counts = manager.dict()

        # Managed namespace for zoo stats
        zoo_stats = manager.Namespace()
        zoo_stats.total_animals = 0

        # Two groups of animals arriving at the zoo
        animals_group1 = ['lion', 'tiger', 'elephant']
        animals_group2 = ['giraffe', 'lion', 'zebra']

        # Create processes that add animals
        p1 = Process(target=add_animals, args=(zoo_list, zoo_counts, zoo_stats, animals_group1))
        p2 = Process(target=add_animals, args=(zoo_list, zoo_counts, zoo_stats, animals_group2))

        p1.start()
        p2.start()
        p1.join()
        p2.join()

        print("\nFinal Zoo List:", list(zoo_list))
        print("Final Zoo Counts:", dict(zoo_counts))
        print("Total Animals in Zoo:", zoo_stats.total_animals)
In this example, each process adds animals to the shared zoo_list, updates the count of each animal type in the shared dictionary zoo_counts, and updates the total number of animals in the shared namespace zoo_stats. The Manager makes it simple to coordinate complex data structures between processes. (Note that compound updates such as zoo_stats.total_animals += 1 are a separate read and write; the sleeps make collisions unlikely in this demo, but a production version should guard them with a manager.Lock().)
Conclusion
Using multiprocessing.Manager objects makes it easy to share complex Python objects like lists, dictionaries, namespaces, queues, and even custom classes across processes. They handle the communication and synchronization behind the scenes, so you can focus on building logic instead of managing low-level data exchange.
In this article, you learned how to create and use these Manager objects, from basic types like lists and dicts to custom classes registered with BaseManager. Whether you’re building simulations, data pipelines, or multi-process applications, Manager objects give you a clear and safe way to coordinate data across processes.
Now that you’ve seen how they work, try using Manager objects in your own projects—real-world inter-process communication can be that simple.