How can I get the endpoint to communicate with the OpenAI service without `APIConnectionError`?

ali 0 Reputation points
2024-11-02T20:07:33.34+00:00

Hello Microsoft community,

I have been working through the three parts of the Build a custom chat app with the prompt flow SDK tutorial:

  • ✅ Part 1: Set up resources
  • ✅ Part 2: Add data retrieval to a chat app
  • ❓Part 3: Evaluate and deploy a chat app

I completed Part 1 and Part 2 successfully. In Part 3 (https://video2.skills-academy.com/en-us/azure/ai-studio/tutorials/copilot-sdk-evaluate-deploy) I was able to progress to the end of the "View evaluation results in AI Studio" section. My issue is with the app deployment.

I can run the deploy.py script without any errors, and the endpoint is visible in Azure AI Studio. However, when I send a message to the endpoint, I get a connection error from the OpenAI API.
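For reference, this is roughly how I send the test message to the deployed endpoint (the scoring URI and key below are placeholders; the real values come from the endpoint's Consume tab in Azure AI Studio):

```python
import json
import urllib.request

# Placeholders -- substitute the real scoring URI and key from AI Studio.
SCORING_URI = "https://<endpoint-name>.<region>.inference.ml.azure.com/score"
API_KEY = "<endpoint-key>"

# The flow expects chat_input and chat_history, matching the inputs
# visible in the deployment log below.
body = json.dumps({"chat_input": "Hello", "chat_history": []}).encode("utf-8")

request = urllib.request.Request(
    SCORING_URI,
    data=body,
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
)
# urllib.request.urlopen(request) would dispatch it; skipped here
# because the URI and key above are placeholders.
```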


Additional info

  • Resources are located in swedencentral.
  • I have waited over an hour after allocating roles. Problem persists.
  • As noted, the evaluation script runs OK.
  • I use model version 1106 of gpt-35-turbo (API version 2024-05-01-preview), since the tutorial's default version is not available in swedencentral.
  • Below is the relevant portion of the deployment log:
[2024-11-02 19:55:39 +0000][pfserving-app][INFO] - Start monitoring new request, request_id: 866488f4-6ce3-4f9c-9239-a1e6a4ebc3b8, client_request_id: 32988455-794a-4bf1-b3f3-45b35384c305
[2024-11-02 19:55:39 +0000][pfserving-app][INFO] - Start loading request data...
[2024-11-02 19:55:39 +0000][pfserving-app][INFO] - Received trace context: {'current-span-8c8dd05e-43f8-4a65-bc17-30c630af0ea2': NonRecordingSpan(SpanContext(trace_id=0xc5b743f4f98f451f826cd7e4a4257a2c, span_id=0x7ccf2ab833064c22, trace_flags=0x01, trace_state=[], is_remote=True))}
[2024-11-02 19:55:39 +0000][flowinvoker][INFO] - Validating flow input with data {'chat_input': 'Hello', 'chat_history': [{'inputs': {'chat_input': 'Hello'}, 'outputs': {}}]}
[2024-11-02 19:55:39 +0000][flowinvoker][INFO] - Execute flow with data {'chat_input': 'Hello', 'chat_history': [{'inputs': {'chat_input': 'Hello'}, 'outputs': {}}]}
[2024-11-02 19:55:39 +0000][promptflow.core._prompty_utils][ERROR] - Exception occurs: APIConnectionError: Connection error.
[2024-11-02 19:55:39 +0000][flowinvoker][ERROR] - Flow run failed with error: {'message': "Execution failure in 'get_chat_response': (WrappedOpenAIError) OpenAI API hits APIConnectionError: Connection error. [Error reference: https://platform.openai.com/docs/guides/error-codes/api-errors]", 'messageFormat': "Execution failure in '{func_name}': {error_type_and_message}", 'messageParameters': {'error_type_and_message': '(WrappedOpenAIError) OpenAI API hits APIConnectionError: Connection error. [Error reference: https://platform.openai.com/docs/guides/error-codes/api-errors]', 'func_name': 'get_chat_response'}, 'referenceCode': 'Unknown', 'code': 'UserError', 'innerError': {'code': 'ScriptExecutionError', 'innerError': None}, 'additionalInfo': [{'type': 'FlexFlowExecutionErrorDetails', 'info': {'type': 'WrappedOpenAIError', 'message': 'OpenAI API hits APIConnectionError: Connection error. [Error reference: https://platform.openai.com/docs/guides/error-codes/api-errors]', 'traceback': 'Traceback (most recent call last):\n  File "/azureml-envs/prompt-flow/runtime/lib/python3.9/site-packages/httpx/_transports/default.py", line 69, in map_httpcore_exceptions\n    yield\n  File "/azureml-envs/prompt-flow/runtime/lib/python3.9/site-packages/httpx/_transports/default.py", line 233, in handle_request\n    resp = self._pool.handle_request(req)\n  File "/azureml-envs/prompt-flow/runtime/lib/python3.9/site-packages/httpcore/_sync/connection_pool.py", line 216, in handle_request\n    raise exc from None\n  File "/azureml-envs/prompt-flow/runtime/lib/python3.9/site-packages/httpcore/_sync/connection_pool.py", line 196, in handle_request\n    response = connection.handle_request(\n  File "/azureml-envs/prompt-flow/runtime/lib/python3.9/site-packages/httpcore/_sync/connection.py", line 99, in handle_request\n    raise exc\n  File "/azureml-envs/prompt-flow/runtime/lib/python3.9/site-packages/httpcore/_sync/connection.py", line 76, in handle_request\n    stream = 
self._connect(request)\n  File "/azureml-envs/prompt-flow/runtime/lib/python3.9/site-packages/httpcore/_sync/connection.py", line 122, in _connect\n    stream = self._network_backend.connect_tcp(**kwargs)\n  File "/azureml-envs/prompt-flow/runtime/lib/python3.9/site-packages/httpcore/_backends/sync.py", line 213, in connect_tcp\n    sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)\n  File "/azureml-envs/prompt-flow/runtime/lib/python3.9/contextlib.py", line 137, in __exit__\n    self.gen.throw(typ, value, traceback)\n  File "/azureml-envs/prompt-flow/runtime/lib/python3.9/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions\n    raise to_exc(exc) from exc\nhttpcore.ConnectError: [Errno -2] Name or service not known\n\nThe above exception was the direct cause of the following exception:\n\nTraceback (most recent call last):\n  File "/azureml-envs/prompt-flow/runtime/lib/python3.9/site-packages/openai/_base_client.py", line 990, in _request\n    response = self._client.send(\n  File "/azureml-envs/prompt-flow/runtime/lib/python3.9/site-packages/httpx/_client.py", line 914, in send\n    response = self._send_handling_auth(\n  File "/azureml-envs/prompt-flow/runtime/lib/python3.9/site-packages/httpx/_client.py", line 942, in _send_handling_auth\n    response = self._send_handling_redirects(\n  File "/azureml-envs/prompt-flow/runtime/lib/python3.9/site-packages/httpx/_client.py", line 979, in _send_handling_redirects\n    response = self._send_single_request(request)\n  File "/azureml-envs/prompt-flow/runtime/lib/python3.9/site-packages/httpx/_client.py", line 1015, in _send_single_request\n    response = transport.handle_request(request)\n  File "/azureml-envs/prompt-flow/runtime/lib/python3.9/site-packages/httpx/_transports/default.py", line 233, in handle_request\n    resp = self._pool.handle_request(req)\n  File "/azureml-envs/prompt-flow/runtime/lib/python3.9/contextlib.py", line 137, in __exit__\n    self.gen.throw(typ, value, traceback)\n  
File "/azureml-envs/prompt-flow/runtime/lib/python3.9/site-packages/httpx/_transports/default.py", line 86, in map_httpcore_exceptions\n    raise mapped_exc(message) from exc\nhttpx.ConnectError: [Errno -2] Name or service not known\n\nThe above exception was the direct cause of the following exception:\n\nTraceback (most recent call last):\n  File "/azureml-envs/prompt-flow/runtime/lib/python3.9/site-packages/promptflow/core/_prompty_utils.py", line 1003, in wrapper\n    return func(*args, **kwargs)\n  File "/azureml-envs/prompt-flow/runtime/lib/python3.9/site-packages/promptflow/core/_flow.py", line 451, in __call__\n    response = send_request_to_llm(api_client, self._model.api, params)\n  File "/azureml-envs/prompt-flow/runtime/lib/python3.9/site-packages/promptflow/core/_prompty_utils.py", line 197, in send_request_to_llm\n    result = client.chat.completions.create(**parameters)\n  File "/azureml-envs/prompt-flow/runtime/lib/python3.9/site-packages/promptflow/tracing/_integrations/_openai_injector.py", line 88, in wrapper\n    return f(*args, **kwargs)\n  File "/azureml-envs/prompt-flow/runtime/lib/python3.9/site-packages/promptflow/tracing/_trace.py", line 513, in wrapped\n    output = func(*args, **kwargs)\n  File "/azureml-envs/prompt-flow/runtime/lib/python3.9/site-packages/openai/_utils/_utils.py", line 274, in wrapper\n    return func(*args, **kwargs)\n  File "/azureml-envs/prompt-flow/runtime/lib/python3.9/site-packages/openai/resources/chat/completions.py", line 815, in create\n    return self._post(\n  File "/azureml-envs/prompt-flow/runtime/lib/python3.9/site-packages/openai/_base_client.py", line 1277, in post\n    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))\n  File "/azureml-envs/prompt-flow/runtime/lib/python3.9/site-packages/openai/_base_client.py", line 954, in request\n    return self._request(\n  File "/azureml-envs/prompt-flow/runtime/lib/python3.9/site-packages/openai/_base_client.py", line 
1024, in _request\n    raise APIConnectionError(request=request) from err\nopenai.APIConnectionError: Connection error.\n\nDuring handling of the above exception, another exception occurred:\n\nTraceback (most recent call last):\n  File "/azureml-envs/prompt-flow/runtime/lib/python3.9/site-packages/promptflow/tracing/_trace.py", line 513, in wrapped\n    output = func(*args, **kwargs)\n  File "/var/azureml-app/azureml-models/copilot_flow_model/7/copilot_flow/copilot.py", line 92, in get_chat_response\n    searchQuery = intentPrompty(query=chat_input, chat_history=chat_history)\n  File "/azureml-envs/prompt-flow/runtime/lib/python3.9/site-packages/promptflow/tracing/_trace.py", line 513, in wrapped\n    output = func(*args, **kwargs)\n  File "/azureml-envs/prompt-flow/runtime/lib/python3.9/site-packages/promptflow/core/_prompty_utils.py", line 1022, in wrapper\n    raise WrappedOpenAIError(e)\npromptflow.core._errors.WrappedOpenAIError: OpenAI API hits APIConnectionError: Connection error. [Error reference: https://platform.openai.com/docs/guides/error-codes/api-errors]\n'}}], 'debugInfo': {'type': 'ScriptExecutionError', 'message': "Execution failure in 'get_chat_response': (WrappedOpenAIError) OpenAI API hits APIConnectionError: Connection error. [Error reference: https://platform.openai.com/docs/guides/error-codes/api-errors]", 'stackTrace': 'Traceback (most recent call last):\n', 'innerException': None}}
[2024-11-02 19:55:39 +0000][pfserving-app][INFO] - Finish monitoring request, request_id: 866488f4-6ce3-4f9c-9239-a1e6a4ebc3b8, client_request_id: 32988455-794a-4bf1-b3f3-45b35384c305.
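In case it is relevant: the tutorial's deploy.py forwards connection settings from my local .env to the deployment as environment variables, roughly as sketched below (simplified; the exact variable names are my recollection of the script, not a verbatim copy). I wonder whether an empty value here could explain the "Name or service not known" error, since it would only surface at request time rather than during deployment.

```python
import os

def build_deployment_env():
    """Collect the connection settings that deploy.py passes to the
    online deployment (simplified sketch; names follow the tutorial's .env)."""
    return {
        "AZURE_OPENAI_ENDPOINT": os.environ.get("AZURE_OPENAI_ENDPOINT", ""),
        "AZURE_OPENAI_API_VERSION": os.environ.get("AZURE_OPENAI_API_VERSION", ""),
        "AZURE_OPENAI_DEPLOYMENT": os.environ.get("AZURE_OPENAI_DEPLOYMENT", ""),
        "AZURE_SEARCH_ENDPOINT": os.environ.get("AZURE_SEARCH_ENDPOINT", ""),
    }

# Any key with an empty value would be deployed without error but fail
# later, when the flow first tries to resolve the missing hostname.
missing = [name for name, value in build_deployment_env().items() if not value]
```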
