I am developing a Python package that consists of two modules:
- A “tracking” module that can be used to track/log certain performance metrics of other Python scripts.
- A “ui” module (a web app) that is used to visualize the tracked information.
For now, the ui module is only planned to be used locally by the user.
So far, the high-level project structure looks like this:
```
project/
├── package_name/
│   ├── tracking/
│   │   └── ...
│   └── ui/
│       ├── frontend/        (create-react-app structure)
│       └── backend/
│           ├── data_loader/
│           ├── static/
│           ├── routes.py
│           └── app.py
└── setup.py
```
I want to use React and “create-react-app” for the frontend and FastAPI + Uvicorn for the backend.
The Python package should provide a command-line interface to start the web app (entry point via setup.py).
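For illustration, such an entry point could be declared roughly like this; the command name `package-ui` and the module path are assumptions, not fixed choices:

```python
# setup.py -- sketch; package and command names are placeholders
from setuptools import find_packages, setup

setup(
    name="package_name",
    packages=find_packages(),
    entry_points={
        "console_scripts": [
            # `package-ui` on the command line would call main() in
            # package_name/ui/backend/app.py
            "package-ui = package_name.ui.backend.app:main",
        ],
    },
)
```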
Here is where my question comes in. After reading a lot, a common practice seems to be to keep the frontend and backend separated and to use docker-compose with a reverse proxy (e.g. nginx) for easier development. Does that also make sense for a locally used app that lives inside a Python package? How would I use this setup inside a Python package that is installable by users? Is it reasonable to have an entry point that uses the subprocess module to call docker-compose, so that the entire web app is fired up locally?
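The docker-compose variant of the entry point would boil down to something like the sketch below; the compose file name and helper names are assumptions:

```python
import subprocess
from pathlib import Path


def compose_command(compose_file: Path, detach: bool = True) -> list:
    """Build the docker-compose invocation as an argument list."""
    cmd = ["docker-compose", "-f", str(compose_file), "up"]
    if detach:
        cmd.append("-d")  # run the containers in the background
    return cmd


def launch_app(compose_file: Path) -> None:
    """Start the whole stack; requires Docker on the user's machine."""
    subprocess.run(compose_command(compose_file), check=True)
```

Note that this makes Docker a runtime requirement for every user of the pip-installable package, which is a fairly heavy dependency for a purely local tool.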
The other option would be to develop the frontend with “create-react-app” and use a proxy in package.json to connect it to the FastAPI backend during development. Once development is done, the FastAPI backend could serve the frontend by running `npm run build` and placing the build output in a /static directory. The entry-point script of the ui module then only needs to start the backend via Uvicorn and the app is running.
Maybe there are some best practices for my use case that I am not aware of. Thanks in advance for your tips and comments.