We are migrating an API from another ESB layer. That layer has a parameter to enforce execution of the API in a single thread on a single worker. This is required because the backend allows only one active auth token at a time: if two requests hit the backend almost simultaneously (milliseconds apart) to fetch a token and then use it in a subsequent call, one of them invariably fails with a 401, because the backend keeps only the latest token active and invalidates the older one. Under high load this becomes a real issue. So we had to cache the JWT on the first call and reuse it until expiry, processing requests sequentially by enforcing single concurrency and deploying the API on a single worker.
Can something similar be done in Apigee? We already have caching enabled, but every time the cache expires, a few calls fail with a 401 error before subsequent calls succeed.
Regards,
Rajeev S
OK, I understand.
It seems your backend could stand to be improved with respect to concurrency.
In the meantime, you need to work around that limitation. I don't know of a way, in Apigee, to acquire a lock such that at MOST a single outbound call can be made at any one time. If I were doing this, I might wrap that token-dispensing logic in a Java or Node.js app that acts as a singleton, and run that app in Cloud Run or similar.
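To illustrate the singleton-dispenser idea, here is a minimal Node-style TypeScript sketch. It is not an Apigee feature, just one way such a standalone service could serialize token fetches; the `Token` shape, `fetchToken` callback, and TTL handling are all hypothetical placeholders for the real backend call:

```typescript
// Hypothetical singleton token dispenser. It guarantees at most one
// in-flight token fetch at a time: concurrent callers piggyback on the
// pending fetch instead of issuing a second request that would
// invalidate the first token at the backend.

type Token = { value: string; expiresAt: number };

class TokenDispenser {
  private cached: Token | null = null;
  private pending: Promise<Token> | null = null;

  // fetchToken stands in for the real call to the backend's token endpoint.
  constructor(private fetchToken: () => Promise<Token>) {}

  async get(): Promise<Token> {
    // Serve the cached token while it is still valid.
    if (this.cached && this.cached.expiresAt > Date.now()) {
      return this.cached;
    }
    // If a fetch is already in flight, reuse it; otherwise start one.
    if (!this.pending) {
      this.pending = this.fetchToken()
        .then((t) => {
          this.cached = t;
          return t;
        })
        .finally(() => {
          this.pending = null;
        });
    }
    return this.pending;
  }
}
```

Because `pending` is assigned synchronously before any `await`, two requests arriving in the same tick share one backend call, which is exactly the behavior the backend's latest-token-wins policy requires.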
Thanks @dchiesa1 for the reply.
Unfortunately, the backend is an external API and not under our company's control. As a middleware and integration team, we have seen so many such limitations in source/target systems that we invariably end up with workarounds rather than the source/target systems fixing their limitations.
For now, we have asked the source system to send a "zero" request that does nothing except go to the backend and fetch a token, which Apigee then caches. Once this succeeds, the actual batch runs and completes within the TTL of the token. For now, this is managed as a workaround.
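The warm-up sequence described above can be sketched as a small TypeScript helper. This is only an illustration of the ordering (prime first, then the batch within the token's TTL); `prime` and the batch calls are hypothetical stand-ins for the real requests:

```typescript
// Hypothetical sketch of the warm-up workaround: one "zero" request primes
// the token cache (in the real setup, Apigee caches the token), then the
// batch runs and must complete before the token's TTL elapses.
async function runBatch(
  prime: () => Promise<void>,
  batchCalls: Array<() => Promise<void>>,
): Promise<void> {
  // Step 1: the no-op request whose only purpose is to populate the
  // token cache, so no later call races to fetch a token.
  await prime();
  // Step 2: the real batch; every call reuses the cached token.
  for (const call of batchCalls) {
    await call();
  }
}
```

The key point is that only the prime call can trigger a token fetch, so the 401-then-success race at cache expiry never involves the batch traffic itself.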
Regards,
Rajeev S