OpenAI Compatibility

Use ASI:One’s API with OpenAI’s client libraries for seamless integration.

Overview

ASI:One’s API is fully compatible with OpenAI’s Chat Completions API format. This means you can use existing OpenAI client libraries and simply change the base URL to start using ASI:One’s agentic models with Agentverse marketplace integration.
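
In practice, the only change from a standard OpenAI setup is the client configuration. A minimal sketch (using the same endpoint and API-key placeholder as the examples below):

# Only the API key and base URL change; everything else is standard OpenAI SDK usage
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_ASI_ONE_API_KEY",    # ASI:One API key instead of an OpenAI key
    base_url="https://api.asi1.ai/v1"  # ASI:One endpoint instead of api.openai.com
)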

API Compatibility

Standard OpenAI Parameters

These parameters work exactly the same as in OpenAI’s API:

  • model - Model name (use ASI:One model names)
  • messages - Chat messages array
  • temperature - Sampling temperature (0-2)
  • max_tokens - Maximum tokens in response
  • top_p - Nucleus sampling parameter
  • frequency_penalty - Frequency penalty (-2.0 to 2.0)
  • presence_penalty - Presence penalty (-2.0 to 2.0)
  • stream - Enable streaming responses

ASI:One-Specific Parameters

These ASI:One-specific parameters are also supported:

  • web_search - Enable web search capabilities
  • x-session-id - Session ID for agentic model persistence (header)
  • Tool calling parameters for Agentverse marketplace agent integration (see the sketch below)

See API Reference for complete parameter details.
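
Tool calling uses the same request shape as OpenAI function calling. The sketch below is illustrative only: the get_weather tool and its schema are assumptions for demonstration, and the exact tool-calling parameters are documented in the API Reference:

# Hedged sketch: OpenAI-style tool calling against the ASI:One endpoint.
# The tool name and schema below are illustrative, not part of this page.
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_ASI_ONE_API_KEY",
    base_url="https://api.asi1.ai/v1"
)

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",  # hypothetical tool for illustration
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"}
            },
            "required": ["city"]
        }
    }
}]

response = client.chat.completions.create(
    model="asi1-mini",
    messages=[{"role": "user", "content": "What's the weather in Delhi?"}],
    tools=tools
)

# If the model decides to call the tool, the call arrives in tool_calls
message = response.choices[0].message
if message.tool_calls:
    for call in message.tool_calls:
        print(call.function.name, call.function.arguments)
else:
    print(message.content)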

Examples with OpenAI SDK

Install the OpenAI library: pip install openai

# Complete Request & Response
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_ASI_ONE_API_KEY",
    base_url="https://api.asi1.ai/v1"
)

response = client.chat.completions.create(
    model="asi1-mini",
    messages=[
        {"role": "system", "content": "Be precise and concise."},
        {"role": "user", "content": "What is agentic AI and how does it work?"}
    ],
    temperature=0.2,
    top_p=0.9,
    max_tokens=1000,
    presence_penalty=0,
    frequency_penalty=0,
    stream=False,
    extra_body={
        "web_search": False
    }
)

print(response.choices[0].message.content)
print(f"Usage: {response.usage}")

# Agentic Model with Session - Working Example
import uuid
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_ASI_ONE_API_KEY",
    base_url="https://api.asi1.ai/v1"
)

# Generate a session ID for agentic models
session_id = str(uuid.uuid4())

print(f"🆔 Session ID: {session_id}")
print("🔄 Making request to asi1-agentic...")

response = client.chat.completions.create(
    model="asi1-agentic",
    messages=[
        {"role": "user", "content": "Check the latest flight arrival status at Delhi airport."}
    ],
    extra_headers={
        "x-session-id": session_id
    },
    temperature=0.7,
    stream=True
)

print("📡 Response received, streaming content:\n")

# Handle the streaming response safely: skip chunks without a content delta
for chunk in response:
    if chunk.choices and chunk.choices[0].delta and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")

print("\n\n🏁 Stream completed!")

# Web Search Integration
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_ASI_ONE_API_KEY",
    base_url="https://api.asi1.ai/v1"
)

response = client.chat.completions.create(
    model="asi1-extended",
    messages=[
        {"role": "user", "content": "Latest developments in AI research"}
    ],
    extra_body={
        "web_search": True
    }
)

print(response.choices[0].message.content)

Understanding the Response Structure

After making a request, your response object includes both standard OpenAI fields and ASI:One-specific fields:

  • choices[0].message.content: The main model response
  • model: The model used
  • usage: Token usage details
  • executable_data: (ASI:One) Agent manifests and tool calls from Agentverse marketplace
  • intermediate_steps: (ASI:One) Multi-step reasoning traces
  • thought: (ASI:One) Model reasoning process

# Accessing response fields
print(response.choices[0].message.content)  # Main answer
print(response.model)                       # Model name
print(response.usage)                       # Token usage

# ASI:One-specific fields
if hasattr(response, 'executable_data'):
    print(response.executable_data)         # Agent calls
if hasattr(response, 'intermediate_steps'):
    print(response.intermediate_steps)      # Reasoning steps
if hasattr(response, 'thought'):
    print(response.thought)                 # Model thinking

Model Selection for OpenAI SDK

Choose the right ASI:One model based on your use case:

Model                  Best For                      OpenAI SDK Usage
asi1-mini              Fast responses, general chat  Standard OpenAI parameters
asi1-fast              Ultra-low latency             Standard OpenAI parameters
asi1-extended          Complex reasoning             Standard OpenAI parameters
asi1-agentic           Agent orchestration           Requires x-session-id header
asi1-fast-agentic      Real-time agents              Requires x-session-id header
asi1-extended-agentic  Complex workflows             Requires x-session-id header
asi1-graph             Data visualization            Standard OpenAI parameters
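
Since only the agentic models require the x-session-id header, a small helper can keep that detail in one place. This is a sketch under the assumption that all agentic model names end in "agentic" (as in the table above); the helper itself is illustrative, not part of the SDK:

# Illustrative helper: attach an x-session-id header only for agentic models
import uuid
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_ASI_ONE_API_KEY",
    base_url="https://api.asi1.ai/v1"
)

def create_chat(model, messages, session_id=None, **kwargs):
    """Call chat.completions.create, adding the session header for agentic models."""
    headers = {}
    if model.endswith("agentic"):
        headers["x-session-id"] = session_id or str(uuid.uuid4())
    return client.chat.completions.create(
        model=model,
        messages=messages,
        extra_headers=headers,
        **kwargs
    )

# Standard model: no session header needed
reply = create_chat("asi1-mini", [{"role": "user", "content": "Hello!"}])
print(reply.choices[0].message.content)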

Next Steps

Ready to get started with ASI:One’s OpenAI-compatible API? Here’s what to do next:

  1. Get your API key - Sign up and create your ASI:One API key
  2. Try the quickstart - Make your first API call in minutes
  3. Explore agentic models - Discover the power of Agentverse marketplace integration
  4. Learn about tool calling - Extend your applications with custom functions

Need help? Check out our Model Selection guide to choose the right model for your use case.