I have a list of coroutines that I want to run concurrently using asyncio.gather(). The problem is that when one task raises an exception, it seems to cancel the other tasks and propagate the error immediately. I want to collect all results (including errors) and handle them individually.
Here is my current code:
```python
import asyncio

async def fetch_data(url, delay):
    await asyncio.sleep(delay)
    if delay == 2:
        raise ValueError(f"Failed to fetch: {url}")
    return f"data from {url}"

async def main():
    urls = ["http://api1.example.com", "http://api2.example.com", "http://api3.example.com"]
    delays = [1, 2, 0.5]
    results = await asyncio.gather(
        *[fetch_data(url, d) for url, d in zip(urls, delays)]
    )
    print(results)

asyncio.run(main())
```
When api2 fails, the entire gather raises and I never get the results from api1 or api3. How can I get all results and handle errors per-task without them cancelling each other?
The `return_exceptions` parameter in the docs? It's exactly what you need here. – asyncio_guide Mar 15, 2022 at 10:12

`asyncio.TaskGroup` if you're on Python 3.11+, it gives you better structured concurrency. – structured_concurrency_fan Mar 15, 2022 at 11:04
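Following up on the `return_exceptions` hint: passing `return_exceptions=True` to `asyncio.gather()` makes it wait for every task and return raised exceptions as values in the results list instead of propagating the first one. Here is a minimal rework of the question's code using that approach (URLs and delays are the question's placeholders):

```python
import asyncio

async def fetch_data(url, delay):
    # Simulate a network call; one URL fails to mimic the question's scenario.
    await asyncio.sleep(delay)
    if delay == 2:
        raise ValueError(f"Failed to fetch: {url}")
    return f"data from {url}"

async def main():
    urls = ["http://api1.example.com", "http://api2.example.com", "http://api3.example.com"]
    delays = [1, 2, 0.5]
    # return_exceptions=True: exceptions are returned in the results list
    # instead of cancelling the siblings and re-raising immediately.
    results = await asyncio.gather(
        *[fetch_data(url, d) for url, d in zip(urls, delays)],
        return_exceptions=True,
    )
    # Handle each outcome individually.
    for url, result in zip(urls, results):
        if isinstance(result, Exception):
            print(f"{url} failed: {result}")
        else:
            print(f"{url} succeeded: {result}")
    return results

results = asyncio.run(main())
```

Results come back in the same order as the input coroutines, so you can zip them with the URLs to report per-task outcomes. If you instead want sibling tasks cancelled when one fails (fail-fast semantics with cleaner error handling), the `asyncio.TaskGroup` suggestion from the comments is the 3.11+ alternative, but it has the opposite behavior to what the question asks for.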