I’m working on a big C# application that is still under development, so we have some room for structural refactoring.
The application is divided into 10 microservices, some of which communicate with each other.
That intercommunication is currently done via 2 different approaches: A) using Apache Kafka as an event bus, and B) using an internal facade/service-object approach when one service needs to request data from another. I want to focus on the latter approach.
This facade/service is implemented by literally copying all the relevant data type definitions and hard-coding the HTTP requests inside each project (i.e. every microservice that needs to talk to another one keeps its own local, duplicated copies of all the request/response types for that service).
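To make the pattern concrete, here is a minimal sketch of what I mean (all names are made up; assume a hypothetical "Customers" service consumed by other services):

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Json;
using System.Threading.Tasks;

// Inside the "Orders" service project — a local, duplicated copy of a
// type that is actually owned by the "Customers" service:
public record CustomerDto(Guid Id, string Name, string Email);

public class CustomersFacade
{
    private readonly HttpClient _http;
    public CustomersFacade(HttpClient http) => _http = http;

    // The HTTP call to the other microservice is hard-coded here
    public Task<CustomerDto?> GetCustomerAsync(Guid id) =>
        _http.GetFromJsonAsync<CustomerDto>($"http://customers-svc/api/customers/{id}");
}

// The same CustomerDto and a near-identical facade class are copy-pasted
// into every other service project that needs to call "Customers".
```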
I’m not very well versed in microservices and REST APIs, but this seems to me like a recipe for disaster. If microservice A is consumed by 5 other microservices and we change A’s API, we now need to update all 5 of those projects with every new or changed data type, rather than pulling in some sort of shared library that contains the facade.
I know managing API versioning is difficult and painful, but this certainly doesn’t seem right to me.
I couldn’t find a straightforward guide that addresses a problem like this.
Can anyone shed some light on this? How should we manage those duplicated data type definitions? And lastly, how should we manage versioning properly between those different APIs?