I found a solution for this by using the OpenAI-compatible endpoint of the Mistral model and setting the extra-parameters header.
import os

from langchain_openai import ChatOpenAI

mistral_model = ChatOpenAI(
    # Ask the Azure endpoint to pass non-OpenAI parameters through to the model.
    default_headers={"extra-parameters": "pass-through"},
    base_url=base_url_mistral,
    api_key=os.getenv("AZURE_MISTRAL_KEY"),
    model=deployment_name_mistral,
)
# With the extra-parameters header set, this call no longer raises an exception.
mistral_model.invoke("hi")
The extra-parameters header is documented here.
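For context, the header controls how the Azure endpoint treats request parameters that are not part of the standard OpenAI schema: with "pass-through" they are forwarded to the underlying model instead of being rejected. The sketch below is only an illustration of that, assuming the same base_url_mistral and deployment_name_mistral placeholders as above, a langchain_openai version that supports the extra_body argument, and Mistral's safe_prompt parameter as the example extra parameter.

import os

from langchain_openai import ChatOpenAI

mistral_model = ChatOpenAI(
    # Without "pass-through", the Azure endpoint rejects parameters it does not
    # recognise from the standard OpenAI request schema.
    default_headers={"extra-parameters": "pass-through"},
    base_url=base_url_mistral,
    api_key=os.getenv("AZURE_MISTRAL_KEY"),
    model=deployment_name_mistral,
    # Mistral-specific parameter, forwarded to the model because of the header.
    extra_body={"safe_prompt": True},
)

print(mistral_model.invoke("hi").content)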