How to fix error for pay-as-you-go inference for Llama 3 through Azure ML/AI?

Shamus Sim 0 Reputation points
2024-04-19T04:15:55+00:00

I get the same error for both Azure AI Studio and Azure ML Studio (screenshots attached).


1 answer

  1. AshokPeddakotla-MSFT 30,066 Reputation points
    2024-04-19T06:04:17.3333333+00:00

    Shamus Sim, greetings and welcome to the Microsoft Q&A forum!

    I understand that you would like to deploy the Meta-Llama-3-8B-Instruct model and are seeing an error that the model is not available.

    Please note that for Meta Llama models, the pay-as-you-go deployment offering is only available with AI hubs created in the East US 2 and West US 3 regions.

    Regarding the error message, it seems that the Meta-Llama-3-8B-Instruct model is not available in the region where you are trying to deploy. Which region are you deploying to?

    Meta model API endpoints can be created in AI Studio projects and Azure Machine Learning workspaces located in East US 2.

    I just tried it in East US 2 and it works.
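
    For reference, here is a minimal sketch of how such a pay-as-you-go (serverless) deployment might be created from code once your hub or workspace is in a supported region. It assumes the azure-ai-ml Python SDK with the ServerlessEndpoint entity available in your SDK version; the subscription, resource group, workspace, endpoint name, and model ID shown are placeholders or assumptions, so verify them against the model catalog and adjust to your environment.

    ```python
    # Minimal sketch (not an official sample): create a serverless (pay-as-you-go)
    # endpoint for Meta-Llama-3-8B-Instruct in a workspace/hub located in East US 2.
    # Requires: pip install azure-ai-ml azure-identity
    from azure.identity import DefaultAzureCredential
    from azure.ai.ml import MLClient
    from azure.ai.ml.entities import ServerlessEndpoint

    # Placeholder identifiers; the workspace/project must live in East US 2 or West US 3.
    ml_client = MLClient(
        credential=DefaultAzureCredential(),
        subscription_id="<SUBSCRIPTION_ID>",
        resource_group_name="<RESOURCE_GROUP>",
        workspace_name="<EASTUS2_WORKSPACE_OR_PROJECT>",
    )

    # Model ID from the azureml-meta registry; assumed here, confirm it in the model catalog.
    endpoint = ServerlessEndpoint(
        name="meta-llama3-8b-instruct-paygo",
        model_id="azureml://registries/azureml-meta/models/Meta-Llama-3-8B-Instruct",
    )

    # Create the endpoint and print the scoring URI to call with your API key.
    created = ml_client.serverless_endpoints.begin_create_or_update(endpoint).result()
    print(created.scoring_uri)
    ```

    If the workspace or hub is in an unsupported region, this call is expected to fail with a model-not-available style error, which is consistent with what you are seeing in the portal.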


    Do let me know if you have any further queries.
