My mistral-large-2407 serverless deployment api is suddenly failing

Michael Jaumann 5 Reputation points
2024-11-11T09:05:13.5966667+00:00

I am using a serverless mistral-large-2407 deployment and connect to its OpenAI-compatible API endpoint through LangChain. Since 8 November 2024 my requests have been failing. I connect to the endpoint in the following way:

from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    model="mistral-large-2407",
    api_key="xxxx",
    base_url="...models.ai.azure.com",
)
llm.invoke("hi")

This error is produced:
Error code: 400 - {'detail': "Extra parameters ['n'] are not allowed when extra-parameters is not set or set to be 'error'. Set extra-parameters to 'pass-through' to pass to the model."}

If I connect with the Mistral client:

from langchain_mistralai import ChatMistralAI

llm = ChatMistralAI(
    model="mistral-large-2407",
    api_key="xxxxx",
    base_url="....models.ai.azure.com",
)
llm.invoke("hi")

This error is produced:

{"detail":"Extra parameters ['safe_prompt'] are not allowed when extra-parameters is not set or set to be 'error'. Set extra-parameters to 'pass-through' to pass to the model."

Thanks in advance!

Azure AI services

2 answers

  1. Michael Jaumann 5 Reputation points
    2024-11-12T14:55:16.9933333+00:00

    I have found a solution for this by using the OpenAI-compatible endpoint of the Mistral model and setting the extra-parameters header.

    import os

    from langchain_openai import ChatOpenAI

    # base_url_mistral and deployment_name_mistral are defined elsewhere in my code.
    mistral_model = ChatOpenAI(
        default_headers={"extra-parameters": "pass-through"},
        base_url=base_url_mistral,
        api_key=os.getenv("AZURE_MISTRAL_KEY"),
        model=deployment_name_mistral,
    )

    # This call now succeeds (previously it raised the 400 extra-parameters error).
    mistral_model.invoke("hi")

    The extra-parameters header is documented here.
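
    The same fix can also be applied at the HTTP level, independent of LangChain. This is a minimal sketch (the endpoint URL, the /v1/chat/completions path, and the bearer-style auth header are assumptions about how the serverless deployment is exposed); with the header set, extra fields such as n or safe_prompt are forwarded to the model instead of being rejected:

    import os

    import requests

    # Hypothetical endpoint URL; replace with the deployment's actual URL.
    endpoint = "https://<deployment>.models.ai.azure.com"

    resp = requests.post(
        f"{endpoint}/v1/chat/completions",  # assumed OpenAI-compatible route
        headers={
            "Authorization": f"Bearer {os.getenv('AZURE_MISTRAL_KEY')}",
            "extra-parameters": "pass-through",  # forward unknown fields to the model
        },
        json={
            "model": "mistral-large-2407",
            "messages": [{"role": "user", "content": "hi"}],
            "n": 1,  # now passed through rather than rejected
        },
    )
    print(resp.status_code)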


  2. AshokPeddakotla-MSFT 34,861 Reputation points
    2024-11-13T13:16:13.81+00:00

    Michael Jaumann, I'm glad that your issue is resolved, and thank you for posting your solution so that others experiencing the same issue can easily reference it!

    Since the Microsoft Q&A community has a policy that question authors cannot accept their own answers (they can only accept answers posted by others), I'll repost your solution here in case you'd like to accept this answer.

    Error Message:

    Error code: 400 - {'detail': "Extra parameters ['n'] are not allowed when extra-parameters is not set or set to be 'error'. Set extra-parameters to 'pass-through' to pass to the model."}

    Solution:

    I have found a solution for this by using the OpenAI-compatible endpoint of the Mistral model and setting the extra-parameters header.

    import os

    from langchain_openai import ChatOpenAI

    # base_url_mistral and deployment_name_mistral are defined elsewhere in my code.
    mistral_model = ChatOpenAI(
        default_headers={"extra-parameters": "pass-through"},
        base_url=base_url_mistral,
        api_key=os.getenv("AZURE_MISTRAL_KEY"),
        model=deployment_name_mistral,
    )

    # This call now succeeds (previously it raised the 400 extra-parameters error).
    mistral_model.invoke("hi")
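
    For reference, invoke() returns a LangChain AIMessage; to see the model's reply, capture the return value and print its content attribute (a trivial sketch reusing the same mistral_model object):

    # The generated text is on the AIMessage's .content attribute.
    reply = mistral_model.invoke("hi")
    print(reply.content)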
    
    

    If you have any other questions, please let me know. Thank you again for your time and patience throughout this issue.

