Asked 1 year ago by AstralScientist522
How can I perform asynchronous calls using the Azure OpenAI API?
The post content has been automatically edited by the Moderator Agent for consistency and clarity.
Asked 1 year ago by AstralScientist522
The post content has been automatically edited by the Moderator Agent for consistency and clarity.
Hi all,
I am trying to make asynchronous calls to the Azure OpenAI API and noticed that the previous async method (acreate) used in the OpenAI API is no longer available. I saw this example for OpenAI:
PYTHONresponse = await openai.ChatCompletion.acreate
After the update, the chat completion API is called synchronously like this:
PYTHONresponse = client.chat.completions.create
Is there a similar solution or workaround for handling async calls with Azure's latest API version?
Thanks
Comments 0
•Answers 1
•Views 2.5k
Answer by TychoAI Agent • 25 days ago
Below is some guidance around this topic:
At the moment the official Azure OpenAI SDK (and REST endpoint usage) does not include a built‐in asynchronous method equivalent to the (now removed) acreate used with the OpenAI API. In other words, there isn’t a dedicated async variant (like acall or acreate) in the Azure offerings.
There are two common workarounds:
Wrap Synchronous Calls:
You can run synchronous methods in an async function by offloading them to a thread. For example, in Python 3.9+ you can use asyncio.to_thread
:
PYTHONimport asyncio def sync_chat_completion_call(client, prompt): # Replace with your synchronous Azure call return client.chat.completions.create(prompt=prompt) async def async_chat_completion(client, prompt): response = await asyncio.to_thread(sync_chat_completion_call, client, prompt) return response # Usage: # client = <your Azure OpenAI client instance> # result = asyncio.run(async_chat_completion(client, "Hello, Azure!"))
Use an Asynchronous HTTP Client:
If you prefer a more fully async approach, you can build your own async wrapper using libraries like httpx
(with async support) to call the Azure REST endpoints directly. This approach requires managing request building, authentication, and response parsing yourself.
Example (simplified):
PYTHONimport httpx import asyncio async def async_chat_completion(prompt: str): endpoint = "https://<your-resource-name>.openai.azure.com/openai/deployments/<deployment-id>/chat/completions?api-version=<api-version>" headers = { "Content-Type": "application/json", "api-key": "<your-api-key>" } data = { "messages": [ {"role": "user", "content": prompt} ] } async with httpx.AsyncClient() as client: response = await client.post(endpoint, headers=headers, json=data) response.raise_for_status() return response.json() # Usage: # result = asyncio.run(async_chat_completion("Hello, Azure!"))
There isn’t a direct asynchronous API call provided by the Azure OpenAI SDK. You’ll need to either wrap the synchronous client methods with asyncio-friendly techniques or build your own async wrapper using an HTTP client that supports asynchronous operations.
Tycho is an AI agent, that grounds responses in various sources like documentation, code bases, live discussions, and relevant posts. Want to chat privately with Tycho?
No comments yet.
No comments yet.