Dear Team,
I am dealing with a scenario where I need to send payloads (files) larger than the default 10MB request payload size limit. To work around this limitation, I've implemented streaming, which allows me to upload payloads exceeding 10MB successfully.
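For reference, here is a simplified sketch of the kind of streaming upload I mean; the endpoint, chunk size, and use of Python's requests library are illustrative, not my exact implementation:

```python
import requests

UPLOAD_URL = "https://api.example.com/upload"  # illustrative endpoint
CHUNK_SIZE = 8 * 1024 * 1024  # 8 MB chunks

def file_chunks(path, chunk_size=CHUNK_SIZE):
    """Yield the file in fixed-size chunks so the whole payload
    never has to be held in memory at once."""
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            yield chunk

def upload(path):
    # Passing a generator makes requests use chunked transfer encoding,
    # so the body is streamed rather than buffered in memory.
    response = requests.post(
        UPLOAD_URL,
        data=file_chunks(path),
        headers={"Content-Type": "application/octet-stream"},
        timeout=600,
    )
    response.raise_for_status()
    return response

if __name__ == "__main__":
    upload("large-file.bin")
```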
However, I've noticed that as the payload size increases (for instance, when uploading 1GB files), response times increase significantly as well. This has raised concerns about the efficiency and scalability of my approach.
I'm already aware of some of the trade-offs with this approach, and I have a few questions, outlined below along with our constraints.
Our current business process involves users uploading substantial documents to our backend system. As part of our efforts to streamline and regulate this process, we are considering the implementation of an API management solution. It is important for us to ensure that any changes introduced do not lead to disruptions or backward compatibility issues.
I am aware of an alternative approach involving the use of 'signed URLs' for handling large payload transfers. However, this method requires a minimum of two requests, which does not align with our operational constraints.
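To illustrate why the two-request requirement is a sticking point, here is a minimal sketch of that flow, assuming a hypothetical /signed-url endpoint on our backend that returns a pre-signed upload URL:

```python
import requests

API_BASE = "https://api.example.com"  # illustrative base URL

def upload_via_signed_url(path):
    # Request 1: ask the backend for a pre-signed upload URL.
    resp = requests.post(f"{API_BASE}/signed-url", json={"filename": path})
    resp.raise_for_status()
    signed_url = resp.json()["url"]

    # Request 2: upload the file directly to object storage,
    # bypassing the gateway and its payload size limit.
    with open(path, "rb") as f:
        put = requests.put(signed_url, data=f, timeout=600)
    put.raise_for_status()
    return put

if __name__ == "__main__":
    upload_via_signed_url("large-file.bin")
```

Our clients would need to be updated to perform both calls, which is the backward-compatibility concern mentioned above.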
I'd greatly appreciate any insights, experiences, or advice from the community regarding this situation. Thank you in advance!