Monkey Patching vs. Async/Await: A Tale of Two Python Concurrency Paradigms
Wenhao Wang
Dev Intern · Leapcell

Introduction
Python, a language celebrated for its versatility and readability, constantly evolves to meet the demands of modern software development. Concurrency, the ability to execute multiple tasks seemingly simultaneously, is a critical aspect of building responsive and performant applications, especially in today's data-driven, network-intensive world. Over the years, Python developers have adopted various strategies to achieve concurrency. Among these, two highly distinct philosophies have emerged: the dynamic, runtime-altering power of monkey patching, and the more structured, explicit control offered by the async/await syntax. This article will explore these two paradigms, contrasting their underlying mechanisms, typical applications, and the trade-offs involved when choosing one over the other for concurrent operations in Python. Understanding these differences is crucial for making informed architectural decisions that impact the maintainability, scalability, and robustness of your applications.
The World of Monkey Patching and Async/Await
Before diving into the direct comparison, let's establish a clear understanding of the core concepts involved.
Core Terminology
- Concurrency: The ability to handle multiple tasks at once. It doesn't necessarily mean tasks are executing simultaneously (parallelism), but rather that progress can be made on more than one task over a period of time.
- Monkey Patching: A technique to extend or modify the runtime code of a program without changing its original source code. This usually involves replacing methods, classes, or even entire modules at runtime.
- async/await: Python's built-in syntax for defining and running coroutines – functions that can be paused and resumed. This non-blocking I/O approach is central to asynchronous programming, allowing a single thread to manage multiple I/O-bound operations effectively.
- Event Loop: The core of `asyncio` and similar asynchronous frameworks. It's responsible for scheduling and executing coroutines, managing I/O operations, and dispatching events.
- Coroutines: Special functions in Python (defined with `async def`) that can pause their execution and yield control back to an event loop, allowing other tasks to run. When the I/O operation they were waiting for completes, they can be resumed.
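The last two terms can be seen in a minimal, self-contained `asyncio` sketch (the `worker` name and the 0.2-second delays are illustrative, not from any particular library):

```python
import asyncio
import time

async def worker(name, delay):
    # `await` pauses this coroutine and hands control back to the event loop
    await asyncio.sleep(delay)
    return name

async def main():
    start = time.perf_counter()
    # The event loop interleaves both coroutines: total time is ~0.2s, not ~0.4s
    results = await asyncio.gather(worker("a", 0.2), worker("b", 0.2))
    return results, time.perf_counter() - start

results, elapsed = asyncio.run(main())
print(results)  # ['a', 'b']
```

Because both coroutines sleep concurrently, the wall-clock time is roughly one delay, not the sum of both.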
Monkey Patching for Concurrency
Monkey patching for concurrency typically involves libraries like gevent or eventlet. These libraries achieve concurrency by patching standard library functions (e.g., I/O operations like socket or time.sleep) to become "cooperatively scheduled." When a patched I/O operation is called, instead of blocking the entire thread, it yields control to the gevent or eventlet scheduler, which then switches to another "greenlet" (a lightweight coroutine provided by these libraries).
How it works:
- Import and "monkey patch" the standard library:
```python
# gevent example
from gevent import monkey
monkey.patch_all()  # Patches standard library modules like socket, ssl, threading, time, etc.

import gevent
import requests  # Will now use gevent-aware sockets under the hood
```

- Define functions that perform I/O operations as usual. No special `async` or `await` keywords are needed.
- Use `gevent.spawn` or `eventlet.spawn` to create greenlets and `gevent.joinall` to wait for them.
Example: Fetching multiple URLs concurrently with gevent:
```python
# gevent_example.py
from gevent import monkey
monkey.patch_all()  # Patch the standard library (e.g., socket, time) before other imports

import time

import gevent
import requests

def fetch_url(url):
    print(f"Starting to fetch: {url}")
    try:
        response = requests.get(url)
        print(f"Finished fetching: {url}, Status: {response.status_code}")
        return len(response.content)
    except Exception as e:
        print(f"Error fetching {url}: {e}")
        return 0

urls = [
    "http://www.google.com",
    "http://www.yahoo.com",
    "http://www.bing.com",
    "http://www.python.org",
]

if __name__ == "__main__":
    start_time = time.time()
    # Spawn a greenlet for each URL
    greenlets = [gevent.spawn(fetch_url, url) for url in urls]
    # Wait for all greenlets to complete
    gevent.joinall(greenlets)
    total_bytes = sum(g.value for g in greenlets)
    end_time = time.time()
    print(f"\nTotal bytes fetched: {total_bytes}")
    print(f"Total time taken: {end_time - start_time:.2f} seconds")
```
In this example, requests.get() ordinarily blocks. However, after monkey.patch_all(), the underlying socket operations become non-blocking and yield control, allowing other fetch_url calls to proceed concurrently in the same thread.
Application Scenarios:
- Migrating existing blocking code to a concurrent model with minimal code changes.
- Integrating with legacy libraries that do not support `async`/`await`.
- Web servers (like Gunicorn using `gevent` or `eventlet` workers) to handle many concurrent connections.
The async/await World
async/await is Python's explicit, native support for asynchronous programming: the syntax arrived in Python 3.5 and builds on the asyncio library (added in Python 3.4). It operates on the principle of cooperative multitasking using an event loop. Functions marked with async def are coroutines, and they explicitly indicate points where they might pause their execution using the await keyword.
How it works:
- Define coroutines using `async def`.
- Inside a coroutine, use `await` to pause execution until an awaitable (another coroutine, a future, or a task) completes.
- An event loop runs these coroutines, switching between them when one awaits an I/O operation.
Example: Fetching multiple URLs concurrently with asyncio and httpx:
```python
# asyncio_example.py
import asyncio
import time

import httpx  # An async-aware HTTP client

async def fetch_url_async(client, url):
    print(f"Starting to fetch: {url}")
    try:
        response = await client.get(url)  # Await the async HTTP GET request
        print(f"Finished fetching: {url}, Status: {response.status_code}")
        return len(response.content)
    except Exception as e:
        print(f"Error fetching {url}: {e}")
        return 0

async def main():
    urls = [
        "http://www.google.com",
        "http://www.yahoo.com",
        "http://www.bing.com",
        "http://www.python.org",
    ]
    start_time = time.time()
    async with httpx.AsyncClient() as client:  # Use an async HTTP client
        tasks = [fetch_url_async(client, url) for url in urls]
        # Run tasks concurrently and wait for all to complete
        results = await asyncio.gather(*tasks)
    total_bytes = sum(results)
    end_time = time.time()
    print(f"\nTotal bytes fetched: {total_bytes}")
    print(f"Total time taken: {end_time - start_time:.2f} seconds")

if __name__ == "__main__":
    asyncio.run(main())  # Run the main coroutine
```
Here, httpx.AsyncClient().get() is an awaitable. When await client.get(url) is called, fetch_url_async pauses, and the event loop can switch to other pending tasks or start new get requests concurrently.
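A common refinement of this fan-out pattern is bounding how many requests run at once with `asyncio.Semaphore`. The sketch below uses `asyncio.sleep` as a stand-in for the HTTP call so it runs without a network; the `fetch` name, the example URLs, and the limit of 2 are illustrative:

```python
import asyncio

async def fetch(sem, url):
    async with sem:  # at most 2 coroutines are inside this block at a time
        await asyncio.sleep(0.05)  # stand-in for the real `await client.get(url)`
        return url

async def main():
    sem = asyncio.Semaphore(2)  # cap concurrent "requests" at 2
    urls = [f"http://example.com/{i}" for i in range(6)]
    return await asyncio.gather(*(fetch(sem, u) for u in urls))

results = asyncio.run(main())
print(len(results))  # 6
```

`asyncio.gather` still returns results in submission order, so bounding concurrency does not reorder the output.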
Application Scenarios:
- Building high-performance network applications (web servers, APIs, chat applications).
- Services that talk to databases or other microservices through asynchronous drivers and clients.
- Any I/O-bound application where explicit control over concurrency flow is desired.
- Modern web frameworks like FastAPI and Starlette are built entirely around `asyncio`.
Comparison and Trade-offs
| Feature | Monkey Patching (e.g., gevent) | Async/Await (asyncio) |
|---|---|---|
| Explicitness | Implicit: standard blocking calls become non-blocking. | Explicit: async and await keywords clearly mark concurrency. |
| Invasiveness | Highly invasive: modifies global state and standard library at runtime. | Non-invasive: relies on explicit async APIs. |
| Learning Curve | Lower for existing blocking code, but understanding patching can be complex. | Higher for beginners, requires understanding event loops and coroutines. |
| Ecosystem Support | Niche, relies on patched versions of libraries. Not all libraries are patchable. | Growing rapidly, many modern libraries are asyncio native. |
| Debugging | Can be challenging due to implicit control flow and modified stack traces. | Generally clearer error messages and stack traces due to explicit control. |
| Performance | Excellent for I/O-bound tasks in a single thread. Similar to asyncio in many cases. | Excellent for I/O-bound tasks. Native support offers potential for optimization. |
| Compatibility | Can be problematic with libraries that do their own low-level I/O or threading. | Requires libraries to be asyncio compatible or provide async wrappers. |
| Mental Model | "It just works" (until it doesn't). | "Explicitly cooperative." |
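One compatibility caveat from the table is worth demonstrating: under asyncio, a plain blocking call inside a coroutine stalls the whole event loop, because nothing patches it for you. Since Python 3.9, `asyncio.to_thread` is the usual escape hatch; the 0.2-second timings below are illustrative:

```python
import asyncio
import time

def blocking_io():
    time.sleep(0.2)  # a blocking call; run directly in a coroutine, it would freeze the loop
    return "done"

async def main():
    start = time.perf_counter()
    results = await asyncio.gather(
        asyncio.to_thread(blocking_io),      # run the blocking work in a thread
        asyncio.sleep(0.2, result="timer"),  # meanwhile the event loop stays responsive
    )
    return results, time.perf_counter() - start

results, elapsed = asyncio.run(main())
print(results)  # ['done', 'timer']
```

Both tasks finish in roughly 0.2 seconds total; had `blocking_io` been called directly inside `main`, the sleeps would have run back to back. Under gevent, by contrast, `monkey.patch_all()` would have made `time.sleep` cooperative automatically.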
Where Monkey Patching Shines:
- Legacy Code Migration: When you have a vast existing codebase written with traditional blocking calls, monkey patching can offer a quick way to introduce concurrency without a massive rewrite.
- Rapid Prototyping: For quickly adding concurrency to a non-concurrent application where the overhead of rewriting for `asyncio` is deemed too high.
Where Async/Await Shines:
- New Projects: For building modern, high-performance applications from scratch where an explicit concurrency model is preferred.
- Robustness and Maintainability: The explicit nature of `async`/`await` makes code easier to reason about, understand, and debug, leading to more maintainable applications in the long run.
- Modern Ecosystem: Integrating seamlessly with the growing ecosystem of `asyncio`-native libraries and frameworks.
- Clearer Control Flow: Developers have finer-grained control over when context switches occur.
Conclusion
Both monkey patching (primarily through libraries like gevent) and async/await offer powerful ways to achieve concurrency in Python. Monkey patching provides an almost magical transition for existing blocking code, allowing it to run concurrently with minimal syntax changes, but at the cost of global state modification and potential debugging headaches. In contrast, async/await requires a more explicit and intentional approach: asynchronous operations are marked clearly, which leads to more robust, maintainable, and modern code, especially for new projects or complete rewrites. The choice between the two paradigms hinges on a project's specific needs, its existing codebase, and the team's comfort with explicit asynchronous programming, but the trend clearly favors the explicit control and growing ecosystem of async/await. For greenfield projects, embracing async/await is the path forward for scalable and transparent concurrent Python applications.

