Issue with Concurrent Requests Handling in Flowise with Azure OpenAI #2632
Unanswered
seonghobae asked this question in Q&A
Replies: 0 comments
Hello,
I am using Azure OpenAI via the Chat Azure OpenAI node. However, Flowise does not appear to handle concurrent requests from clients: when multiple clients send requests at the same time, the requests are not processed in parallel. The traces in Langfuse also suggest that concurrent requests are being handled sequentially rather than concurrently.
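For reference, this is a minimal sketch of how I am reproducing the behavior. The endpoint path follows Flowise's REST prediction API; the base URL, chatflow ID, and worker count are placeholders for my local setup, not values from Flowise itself:

```python
import json
import time
from concurrent.futures import ThreadPoolExecutor
from urllib import request

# Placeholder: a local Flowise instance and a hypothetical chatflow ID.
FLOWISE_URL = "http://localhost:3000/api/v1/prediction/<chatflow-id>"

def ask(question: str) -> str:
    """POST a single question to the Flowise prediction endpoint."""
    body = json.dumps({"question": question}).encode()
    req = request.Request(
        FLOWISE_URL,
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return resp.read().decode()

def fire_concurrently(call, payloads):
    """Send all payloads at once and measure total wall-clock time.

    If requests were truly processed in parallel, the elapsed time
    should be close to the slowest single request, not the sum of all.
    """
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=len(payloads)) as pool:
        results = list(pool.map(call, payloads))
    return results, time.perf_counter() - start

# Example run against a live instance (uncomment with a real chatflow ID):
# answers, elapsed = fire_concurrently(ask, [f"Question {i}" for i in range(5)])
# print(f"5 concurrent requests finished in {elapsed:.1f}s")
```

When I run something like this, the total elapsed time grows roughly linearly with the number of requests, which is what leads me to believe the requests are serialized somewhere.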
Is there a recommended configuration or approach to resolve this? How can I ensure that Flowise handles concurrent requests effectively?
Thank you.