431 Error in GPT-4o
Hi, while sending an image URL in the body for GPT-4o, I am getting "431 Request Header Fields Too Large". The same request with the same image URL works in GPT-4 Turbo. I am not able to find the exact cause for this. What could be the reason behind this…
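For reference, a minimal sketch (with a placeholder model name and image URL) of how an image URL is normally carried in the JSON body of a chat-completions request. A 431 specifically concerns request *headers*, so it is also worth checking that nothing oversized (cookies, auth tokens, or a base64 data URL) is being sent as a header:

```python
# Hedged sketch of a chat-completions request body that passes an image by
# URL using the standard multi-part "content" format. Model name and image
# URL are placeholders, not values from the original report.
payload = {
    "model": "gpt-4o",
    "messages": [
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "Describe this image."},
                {
                    "type": "image_url",
                    "image_url": {"url": "https://example.com/photo.png"},
                },
            ],
        }
    ],
}
```

The image URL belongs inside this JSON body; the headers should carry little more than the API key and content type.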
"tool_choice": "required" sometimes not supported in Azure OpenAI service in 2024-07-01-preview
I am experimenting with the new API version 2024-07-01-preview, which supports "tool_choice": "required". I noticed that this option is intermittently not supported (i.e. when I send the same request repeatedly, I sometimes receive…
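As context for the report above, a sketch of the request shape involved, assuming the documented `tools`/`tool_choice` fields; the `get_weather` function here is a hypothetical example, not something from the original question:

```python
# Hedged sketch of a chat-completions request that forces a tool call via
# "tool_choice": "required" (supported from API version 2024-07-01-preview).
# The tool definition is a made-up illustration.
payload = {
    "messages": [{"role": "user", "content": "What's the weather in Oslo?"}],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",  # hypothetical function
                "description": "Look up current weather for a city.",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }
    ],
    "tool_choice": "required",  # model must call some tool, any tool
}
```

If the deployment intermittently rejects this, capturing the full error body per attempt makes it easier to tell a version-routing issue from a malformed request.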
How many concurrent threads can be run using one assistant?
I have 100+ applications. Each application can have many users. I am considering using one assistant per application. I want to know the number of threads each assistant can use.
Request for Billing Adjustment Due to Unintentional Over-Provisioning of GPT-4
Dear Microsoft Support Team, I am writing to you in a state of deep concern and urgency. As a first-time user of Azure OpenAI, I mistakenly provisioned the GPT-4 model with the 'Provisioned-Managed' option, setting it to 100 PTU. Unfortunately, I did not…
Inquiry Regarding Data Processing Regions for Azure OpenAI GPT-4o Model
Hello, We are planning to use the GPT-4o model from Azure OpenAI Service for an internal project. However, since the model is not available in the Japan East region, we must use the Global-Standard region. To comply with our company's export control…
Why is the token usage event available in the stream from GPT-4o-mini and not from GPT-4o
I noticed that an event.usage is available when answers are streamed from OpenAI. This is great because we can now finally store the actual token usage per question. However, this feature only seems to be available in gpt-4o-mini and not in gpt-4o. To…
Retrieving token usage in Azure OpenAI response when streaming is enabled
I have an Azure OpenAI deployment used by multiple internal users that charges back based on token usage found in the "usage" field of the API response. However, users who stream the response with "stream=True" do not receive the…
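One way to picture the expected behaviour: with `stream_options={"include_usage": True}` (available in newer API versions), the stream's final chunk carries the usage totals while earlier chunks carry `usage: None`. A runnable sketch over simulated chunk dicts (the real SDK yields response objects, not dicts):

```python
def usage_from_stream(chunks):
    """Return the usage totals from a streamed chat response.

    Assumes the include_usage behaviour: every chunk has usage=None
    except a final chunk holding the aggregate counts.
    """
    usage = None
    for chunk in chunks:
        if chunk.get("usage"):
            usage = chunk["usage"]
    return usage


# Simulated chunks standing in for a real streamed response:
fake_stream = [
    {"choices": [{"delta": {"content": "Hel"}}], "usage": None},
    {"choices": [{"delta": {"content": "lo"}}], "usage": None},
    {"choices": [], "usage": {"prompt_tokens": 9,
                              "completion_tokens": 2,
                              "total_tokens": 11}},
]
print(usage_from_stream(fake_stream)["total_tokens"])  # 11
```

For chargeback, the caller would record this final usage dict per request instead of relying on the (absent) top-level "usage" field of streamed responses.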
Getting useful content-filtering results for troubleshooting in Azure OpenAI
Hello there, I'm calling Azure OpenAI from Power Automate and want to see what's causing the trouble when a prompt completion is stopped by (none other than) the content filter. However, the returned body is not showing anything useful. Below is a sample…
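When the filter does return details, they typically sit under `error.innererror.content_filter_result` in the 400 response body. A small sketch (response shape assumed from the documented error format, values illustrative) that pulls out the flagged categories:

```python
def filtered_categories(error_body):
    """Return the content-filter categories flagged in an Azure OpenAI error.

    Assumes the documented shape where per-category results live under
    error.innererror.content_filter_result.
    """
    result = (error_body.get("error", {})
              .get("innererror", {})
              .get("content_filter_result", {}))
    return {name: info for name, info in result.items()
            if isinstance(info, dict) and info.get("filtered")}


# Sample body shaped like a content-filter 400 response (values illustrative):
sample = {
    "error": {
        "code": "content_filter",
        "message": "The response was filtered",
        "innererror": {
            "code": "ResponsibleAIPolicyViolation",
            "content_filter_result": {
                "hate": {"filtered": False, "severity": "safe"},
                "violence": {"filtered": True, "severity": "medium"},
            },
        },
    }
}
print(list(filtered_categories(sample)))  # ['violence']
```

In Power Automate the same extraction can be done with a Parse JSON action over the error body, walking the same path.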
Regarding the embeddings.openai process in AI Studio indexing, encountering an error that says 'public access disabled'. How can I resolve this issue?
Hello, I am encountering an issue in Azure AI Studio. After uploading data and attempting to index it, the process fails during the LLM-Crack, Chunk, and Embed Data step, with the following error log: INFO azureml.rag.embeddings.openai - Attempt 0 to…
I am not able to perform inference on images after adding data sources to Azure GPT Vision models (GPT-4). Previously, I was able to do it.
I set up RAG using files available in Azure Blob Storage and did the same with Azure Search. However, after adding data sources, I am not able to provide an image as input for inference. I am getting the following error: openai.BadRequestError: Error…
2024-05-01-preview has "upload file" but URL returns 404 :-(
https://github.com/Azure/azure-rest-api-specs/tree/main/specification/cognitiveservices/data-plane/AzureOpenAI/inference/preview/2024-05-01-preview has "upload file" swagger specs! Woohooo. I couldn't find it in the earlier releases The doc at…
Chat Playground unable to understand different non-standard date formats
Hi guys, I'm currently using Azure's Chat Playground to build a chatbot to chat with my data. I indexed a table from my DB (which is on SQL Server) with 12 columns. One of the columns is called DATE, which is in the date (yyyy-mm-dd) format. The issue is…
Error with Azure OpenAI 'Bring Your Own Data' Service Preventing Index Creation
Hello, I'm having a problem with Azure OpenAI and the creation of AI Search indexes for my documents using the "bring your own data" service. The problem is that the index is not created with my documents, and I get the following error: "We were unable to…
Can the parent company make OpenAI available to subsidiaries as part of its subscription?
I would like to ask whether employees of subsidiary companies can utilize Azure OpenAI under the parent company's subscription, or is it necessary for them to apply separately for access? Could we provide Azure OpenAI access under the company A subscription…
Integrating Custom Data into Azure OpenAI Chatbot.
I've successfully created a chatbot using Azure OpenAI for my custom input data files. I want a chatbot that handles both data-driven responses (based on my provided data) and out-of-domain responses (using the model's general knowledge). When I am…
Sketch2code
How do I access Sketch2Code? https://sketch2code.azurewebsites.net/ is consistently returning an error. Has the location changed?
Fine-tuned GPT-4o mini model not working in Chat Playground
Value for cogsvc-openai-model-version is invalid. | Apim-request-id: 5299a6db-9eea-41e8-9cb1-70d6f9ee07f3 ^^ Getting the above error when I send a picture with text to the chat in the Playground. Found that on Aug 1st the same error was there…
Automatic indexer creation is not working in Azure OpenAI Studio
Hi Team, We have been using Azure OpenAI Studio to configure private data stored in an Azure Blob Storage container. Typically, when setting up the "Add Your Data" feature from Azure Blob Storage and selecting a daily or hourly index schedule,…
Azure AI Services File Import From URL Throws 415 Error
This is the request body: Request URL: https://ourapiminstance.azure-api.net/genai/azure/assistants/openai/files/import?api-version=2023-12-01-preview Request Body: { "purpose": "fine-tune", "filename":…
Compatibility of Azure OpenAI's Bring Your Own Data with On-Premise Elasticsearch?
Hello, I would like to know if Azure OpenAI's Bring Your Own Data feature works with on-premise Elasticsearch, or does Elasticsearch need to be in the cloud? Thank you.
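For context, the "On Your Data" preview does list Elasticsearch as a data-source type; in practice the deciding factor for an on-premise cluster is whether its endpoint is reachable over HTTPS from the Azure OpenAI service. A hedged sketch of the request shape, with placeholder endpoint, index, and credentials (field names assumed from the preview `data_sources` format):

```python
# Hedged sketch of an "On Your Data" chat request pointing at an
# Elasticsearch cluster. Endpoint, index name, and API key are placeholders;
# an on-premise endpoint would need to be network-reachable from Azure
# (e.g. via a public HTTPS endpoint or appropriate networking setup).
payload = {
    "messages": [{"role": "user", "content": "Summarise our runbook."}],
    "data_sources": [
        {
            "type": "elasticsearch",
            "parameters": {
                "endpoint": "https://elastic.example.internal:9200",
                "index_name": "runbooks",
                "authentication": {
                    "type": "encoded_api_key",
                    "encoded_api_key": "<base64 api key>",
                },
            },
        }
    ],
}
```

If the cluster cannot be exposed, a common alternative is syncing the relevant indices to a cloud-reachable store first.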