
Performance with background tasks #396

Open
nsheff opened this issue Sep 20, 2024 · 0 comments
Comments

@nsheff (Contributor) commented Sep 20, 2024

How can we make sure the server stays performant even if a task requires significant CPU?

https://fastapi.tiangolo.com/deployment/server-workers/

Here's an answer from ChatGPT:


FastAPI can handle asynchronous requests even when running on a single core. The key here is that FastAPI is designed to be asynchronous, meaning that even on a single core, it can interleave tasks using Python's asyncio framework. This allows the server to handle multiple requests concurrently, as long as they aren't CPU-bound.

Here's how this works:

  • Non-blocking operations: When one request is waiting for an I/O operation to complete (like a database call or external API request), the server can switch to handling another request.
  • Concurrency, not parallelism: With one core, FastAPI can perform asynchronous tasks concurrently but not in parallel. So, the benefit of async is mainly realized when requests involve waiting (like I/O).

Example scenario on a single core:

  • Request 1 comes in and initiates a long-running I/O operation (e.g., fetching data from a database).
  • Instead of waiting for the operation to complete, the server can handle Request 2 while Request 1 is still "waiting."
  • Once Request 1’s operation is done, the server can continue processing it.
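As a concrete illustration of that interleaving, here is a minimal sketch (not from the issue itself) of a single-process FastAPI app where one endpoint awaits a slow I/O stand-in while another stays responsive; the route names and the `asyncio.sleep` placeholder are illustrative assumptions.

```python
# Minimal sketch: one core, one event loop, two endpoints.
import asyncio
from fastapi import FastAPI

app = FastAPI()

@app.get("/slow-io")
async def slow_io():
    # Stand-in for an awaited database call or external API request.
    await asyncio.sleep(5)
    return {"status": "done waiting"}

@app.get("/ping")
async def ping():
    # Served immediately, even while /slow-io requests are still "waiting",
    # because the event loop is free whenever slow_io is suspended at `await`.
    return {"status": "ok"}
```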

CPU-bound tasks on a single core:

If your tasks are CPU-bound (like intensive data processing), async alone won’t improve performance: a CPU-bound coroutine blocks the event loop, so requests are effectively processed sequentially. For heavy CPU work you need multiple processes (or threads, for code that releases the GIL).
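One common pattern for keeping the event loop responsive in that situation is to push the CPU-bound work into a separate process. The sketch below assumes a hypothetical `heavy_compute` function and an arbitrary pool size; it is one possible approach, not something prescribed in the issue.

```python
# Sketch: offload CPU-bound work to a process pool so the event loop stays free.
import asyncio
from concurrent.futures import ProcessPoolExecutor
from fastapi import FastAPI

app = FastAPI()
pool = ProcessPoolExecutor(max_workers=2)  # illustrative size

def heavy_compute(n: int) -> int:
    # Pure-Python CPU-bound work: async/await alone would not help here.
    return sum(i * i for i in range(n))

@app.get("/compute")
async def compute(n: int = 10_000_000):
    loop = asyncio.get_running_loop()
    # Runs in a separate process, so neither the GIL nor this event loop blocks.
    result = await loop.run_in_executor(pool, heavy_compute, n)
    return {"result": result}
```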

Multiple cores:

To take advantage of multiple cores, you can use a process manager like Uvicorn with workers:

  • Uvicorn with the --workers flag starts multiple worker processes, each running a FastAPI app instance on its own core.
  • For true parallelism across cores, you can start multiple worker processes with Uvicorn or use Python’s concurrent.futures or multiprocessing for CPU-bound parallel tasks.

You don’t need to change your Python code to take advantage of multiple cores, but you do need to configure Uvicorn (or a similar process manager) to run more workers for CPU-bound workloads.
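For reference, a minimal way to wire that up (assuming the app is importable as `main:app`; the worker count and port are placeholders):

```python
# Equivalent to `uvicorn main:app --workers 4` on the command line.
import uvicorn

if __name__ == "__main__":
    # Each worker is its own process with its own event loop and app instance,
    # so a CPU-bound request in one worker doesn't stall the others.
    uvicorn.run("main:app", host="0.0.0.0", port=8000, workers=4)
```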

In summary:

  • FastAPI can handle asynchronous requests on a single core, thanks to asyncio.
  • To fully utilize multiple cores for CPU-bound tasks, you need to configure Uvicorn with multiple workers.