Software Engineering Stack Exchange is a question and answer site for professionals, academics, and students working within the systems development life cycle.
I have a monolithic REST API and a separate WebSocket server used for real-time updates. They're separate so that I can scale them independently. The API fetches data from the database and sends it to the users who request it, but I also want to fetch data from the database in my WebSocket server, and I'm not sure how to do that in a way that stays consistent.
Options (the ones I thought of):
- Create a shared database module with the schemas and models for my database and use it across both servers.
- On the WebSocket server, I could request the API using the user token to fetch data on their behalf.
- I could split up my API logic into a microservice architecture and have the API act as a gateway with auth logic and proxying requests to the micro services, and the WebSocket server could directly request the microservices without going through the gateway so it doesn’t have to authenticate.
- Fetch the data from the API using RPC over RabbitMQ.
- Move the logic that needs to query the database into my API instead (not ideal).
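For context, option #2 might be sketched roughly like this: the WebSocket server forwards the socket session's user token to the REST API and lets the API's existing auth decide. The function name `fetchForUser`, the `/users/me/profile` path, and the `Bearer` token scheme are all illustrative assumptions, and the transport is injected so it can be stubbed:

```typescript
// Minimal shape of a fetch-like transport; in production this would
// just be Node's global fetch, injected here so the sketch is testable.
type FetchLike = (
  url: string,
  init?: { headers?: Record<string, string> },
) => Promise<{ ok: boolean; status: number; json(): Promise<unknown> }>;

// Fetch data from the REST API on the user's behalf by forwarding the
// token attached to the WebSocket session. All names are assumptions.
async function fetchForUser(
  apiBase: string,
  path: string,
  userToken: string,
  fetchFn: FetchLike,
): Promise<unknown> {
  const res = await fetchFn(`${apiBase}${path}`, {
    headers: { Authorization: `Bearer ${userToken}` },
  });
  if (!res.ok) throw new Error(`API responded ${res.status}`);
  return res.json();
}
```

The upside of this shape is that the API stays the single owner of auth and data access; the WebSocket server never touches the database directly.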
- For option #1, sharing a database module across services is apparently bad design. In all fairness, I don't need to fetch much data in my WebSocket server, only certain fields (which can be cached). But if I implement some form of database cache, there would be two separate caches, one per server (not to mention both servers will be scaled, so there would be multiple instances). I could put the cache in the shared module, but that sounds like a bad idea.
- Option #2 seems pretty sound to me, although again I don't need to fetch ALL the data you would get from a normal API request, so maybe it's overkill?
- Option #3 seems like an okay idea, but I'm not sure how to design it with Node/TypeScript, and there aren't many resources for it. I'm also concerned about how I would set up my firewall to prevent external users from reaching the private microservices. ufw is good enough on a single node, but what if I have multiple nodes? And what if I decide to start using Kubernetes? It just seems like a pain in the ass to set up and prone to issues.
- I like option #4, but I'm not sure how fast it'll be. Since this is a WebSocket server, I need things to happen in real time, as fast as possible.
- For option #5, the WebSocket "requests" that need the data are tied to the socket's session, so if I moved them to the API, I would have to use RabbitMQ or some pub/sub to send the results back to my WebSocket server anyway. That seems like a slow, roundabout process.
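For what it's worth, the core of option #4 is the correlation-ID request/reply pattern (the one RabbitMQ's RPC tutorial describes): each request carries a `correlationId`, and the requester keeps a map of pending promises that a reply-queue consumer resolves. Below is a minimal in-memory sketch of just that pattern; the real RabbitMQ channel is replaced by an injected `send` callback, and the class name, 5-second timeout, and message shape are all assumptions:

```typescript
import { randomUUID } from "node:crypto";

type Reply = { correlationId: string; body: unknown };

class RpcClient {
  // One entry per in-flight request, keyed by correlation ID.
  private pending = new Map<
    string,
    { resolve: (v: unknown) => void; timer: ReturnType<typeof setTimeout> }
  >();

  constructor(
    // Stand-in for channel.sendToQueue(...) on a real RabbitMQ channel.
    private send: (msg: { correlationId: string; body: unknown }) => void,
    private timeoutMs = 5000, // assumed budget; tune for your latency target
  ) {}

  request(body: unknown): Promise<unknown> {
    const correlationId = randomUUID();
    return new Promise((resolve, reject) => {
      const timer = setTimeout(() => {
        this.pending.delete(correlationId);
        reject(new Error(`RPC ${correlationId} timed out`));
      }, this.timeoutMs);
      // Register the pending request before sending, in case the reply
      // comes back synchronously.
      this.pending.set(correlationId, { resolve, timer });
      this.send({ correlationId, body });
    });
  }

  // Wire this up as the consumer of the reply queue.
  onReply(reply: Reply): void {
    const entry = this.pending.get(reply.correlationId);
    if (!entry) return; // late or unknown reply; drop it
    clearTimeout(entry.timer);
    this.pending.delete(reply.correlationId);
    entry.resolve(reply.body);
  }
}
```

With a local broker, the round trip is mostly queue hops, so the latency question largely comes down to where RabbitMQ runs relative to the two servers.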
What would be the best way to approach this? Are there other options?