[Bug]: Cannot handle the error due to "During handling of the above exception, another exception occurred:". #4233
Comments
Your prompt is raising a content safety alert because the content safety filter considers your prompt to contain violence: `content_filter_results={'hate': {'filtered': False, 'severity': 'safe'}, 'self_harm': {'filtered': False, 'severity': 'safe'}, 'sexual': {'filtered': False, 'severity': 'safe'}, 'violence': {'filtered': True, 'severity': 'medium'}}`. So it produces a 400 error.

I understand that my prompt is triggering a content filter error and a 400 Bad Request error.
@tsujimic what is the error raised by your Python caller code? The exception raised is the same, and it should be of type Exception (it inherits from the openai
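To illustrate the point above about the exception hierarchy: because the mapped errors ultimately inherit from `Exception`, a caller can catch either the specific class or any base class. The classes below are a stand-in sketch of such a hierarchy, not litellm's actual definitions.

```python
# Stand-in hierarchy (hypothetical, for illustration only): a specific
# error class derived from a general API error class, both rooted at
# Exception, mirroring the idea described in the comment above.
class APIError(Exception):
    pass

class BadRequestError(APIError):
    pass

def call_provider():
    # Simulate a provider failure being mapped to the specific class.
    raise BadRequestError("AzureException BadRequestError")

try:
    call_provider()
except APIError as e:  # catching the base class also matches the subclass
    print(f"caught via base class: {e}")
```

Catching `Exception` at the outermost level would work the same way, which is why the caller-side handler in the repro below succeeds.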
Unable to repro - this works for me:

```python
from litellm import acompletion, APIError, completion
import asyncio, httpx, traceback
import litellm

async def try_acompletion_error():
    model = "azure/gpt-3.5-turbo"
    request = httpx.Request(
        method="POST",
        url="https://azure.com/"
    )
    exception_to_raise = APIError(
        status_code=400,
        message="invalid_request_error",
        llm_provider="azure",
        request=request,
        model="gpt-35-turbo",
    )
    setattr(exception_to_raise, "response", httpx.Response(status_code=400, request=request))
    try:
        response = completion(
            model=model,
            messages=[{"role": "user", "content": "Hey"}],
            api_version="2023-06-12",
            # stream=True,
            mock_response=exception_to_raise,
            num_retries=0
        )
        chunks = []
        async def data_generator():
            async for chunk in response:
                chunks.append(chunk)
    except Exception as e:
        # When "During handling of the above exception, another exception occurred:",
        # the error cannot be handled.
        print("ERROR CAUGHT! - {}".format(str(e)))

asyncio.run(try_acompletion_error())
```

output:

```
(base) krrishdholakia@Krrishs-MacBook-Air temp_py_folder % python3 linting_tests.py
Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.
ERROR CAUGHT! - litellm.BadRequestError: AzureException BadRequestError - litellm.APIError: invalid_request_error
(base) krrishdholakia@Krrishs-MacBook-Air temp_py_folder %
```
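As a side note on the traceback line the issue title refers to: "During handling of the above exception, another exception occurred:" is Python's implicit exception chaining, printed whenever a new exception is raised inside an `except` block. It does not by itself make the final exception uncatchable, as this minimal stdlib-only sketch (unrelated to litellm internals) shows:

```python
# Demonstrates Python's implicit exception chaining: raising a new
# exception inside an except block produces the "During handling of the
# above exception, another exception occurred:" traceback note, but the
# new exception is still a normal, catchable exception.
class OriginalError(Exception):
    pass

class WrappedError(Exception):
    pass

def raise_wrapped():
    try:
        raise OriginalError("original 400 from the provider")
    except OriginalError:
        # The new exception automatically carries the original one
        # in its __context__ attribute.
        raise WrappedError("mapped provider error")

try:
    raise_wrapped()
except WrappedError as e:
    print(f"ERROR CAUGHT! - {e}")
    assert isinstance(e.__context__, OriginalError)
```

If a handler fails to catch the error, the cause is the exception's type (or how it is re-raised), not the chaining message itself.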
What happened?
Exception handling using the litellm Python SDK fails due to
"During handling of the above exception, another exception occurred:".
This sometimes happens during exception handling when a content filter error occurs in Azure OpenAI Service.
Relevant log output
Twitter / LinkedIn details
No response