So instead of a lot of users each waiting a bit, the one user who asked the backend to perform the task synchronously would potentially wait a long time?
No one said the work on the queue was being done synchronously.
- Many such implementations have a thread pool performing several pieces of work in parallel.
- Other implementations use a priority queue to ensure that quicker jobs are done first.
- Even better implementations combine the two: a high-priority pool and a low-priority pool, each taking work from its own queue.
The point of the queue is to ensure that requests, once received, are remembered until they can be processed.
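The ideas above can be sketched in a few lines. This is a minimal illustration (all job names are made up): jobs sit on a priority queue and a worker thread drains it, so the quick job runs first even though the slow job was enqueued first.

```python
import queue
import threading

work = queue.PriorityQueue()
results = []

# Lower number = higher priority; the slow job is enqueued first to show
# that the queue, not arrival order, decides what runs next.
work.put((10, "slow report"))
work.put((1, "quick lookup"))
work.put((99, None))  # sentinel: tells the worker to stop

def worker():
    while True:
        _priority, job = work.get()
        if job is None:
            break
        results.append(job)  # stand-in for actually doing the work

t = threading.Thread(target=worker)
t.start()
t.join()

print(results)  # ['quick lookup', 'slow report']
```

A real worker pool would run several such threads, catch exceptions per job, and loop forever instead of stopping at a sentinel.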
Aren’t most web servers implemented with queues in the first place?
Yes, but those queues are on the front end, and the timer starts as soon as your program receives the request. You have exactly X seconds to complete processing before the web server terminates the connection. A work queue lets those requests keep being processed beyond that limit.
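The deferral pattern being described might look like the sketch below (no real web framework; `handle_request` and the payload are illustrative): the handler only enqueues the work and answers within the server's time budget, while a background worker finishes the job after the connection could already have been closed.

```python
import queue
import threading
import time

jobs = queue.Queue()
completed = []

def handle_request(payload):
    jobs.put(payload)        # remember the request...
    return "202 Accepted"    # ...and respond immediately

def worker():
    while True:
        payload = jobs.get()
        if payload is None:  # sentinel: shut the worker down
            break
        time.sleep(0.01)     # stand-in for work exceeding the timeout
        completed.append(payload)
        jobs.task_done()

t = threading.Thread(target=worker)
t.start()

status = handle_request({"report": 1})
print(status)                # the client already has its answer here

jobs.join()                  # only this demo waits; a real server would not
jobs.put(None)
t.join()
print(completed)
```

Returning `202 Accepted` with a way to poll for the result later is the usual HTTP shape of this pattern.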
Because a server can respond to multiple requests asynchronously?
Also, those requests are accepted very quickly (assuming a multi-threaded server), which could easily overwhelm your actual resource budget. Deferring work until later is exactly what a queue is good for.
Queues are also useful for handing longer-running work off to other back-end servers tooled for intensive processing.
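That hand-off can be sketched without any broker at all. In this rough illustration (everything here is a stand-in: the worker script, the squaring "work", the JSON-lines protocol), a separate worker process plays the role of the back-end server, reading jobs from its inbound queue and writing results back.

```python
import json
import subprocess
import sys

# The "back-end server": reads one JSON job per line, writes one result per line.
WORKER = r"""
import json, sys
for line in sys.stdin:              # drain the inbound job queue
    job = json.loads(line)
    print(json.dumps(job ** 2))     # stand-in for intensive processing
"""

proc = subprocess.Popen(
    [sys.executable, "-c", WORKER],
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
    text=True,
)

# The "front end" ships its jobs across the process boundary and collects results.
out, _ = proc.communicate("\n".join(json.dumps(n) for n in (3, 4)) + "\n")
results = [json.loads(line) for line in out.splitlines()]
print(results)
```

In production the pipe would be replaced by a durable message queue between machines, but the shape is the same: producer on one side, consumer on the other, with the queue absorbing the difference in their speeds.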
Would this only be relevant if the external service used a protocol other than HTTP, then?
No, queues are supremely useful data structures. They also go by names like message queues, job queues, and task queues, and they are used everywhere.
But more to the point, the network protocol has nothing to do with how a server should be implemented internally.