Migrate from OpenAI to Gonka
Switch to Gonka Network with minimal code changes. Your existing OpenAI SDK code works with GonkaGate.
2 Lines of Code. That's It.
Change your base URL, API key, and model name; everything else stays the same.
Before (OpenAI)
from openai import OpenAI

client = OpenAI(
    base_url="https://api.openai.com/v1",
    api_key="sk-..."
)

After (GonkaGate)

from openai import OpenAI

client = OpenAI(
    base_url="https://api.gonkagate.com/v1",
    api_key="gp-..."
)

From OpenAI SDK
The OpenAI SDK works directly with GonkaGate. Just configure the base URL and use a GonkaGate API key.
Python
before.py
from openai import OpenAI
client = OpenAI() # Uses OPENAI_API_KEY env
response = client.chat.completions.create(
model="gpt-5.2",
messages=[{"role": "user", "content": "Hello!"}]
)
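
For comparison, here is the same call migrated to GonkaGate. This is a minimal sketch: the base URL and gp- key prefix come from the quick-switch example above, and the model ID uses the provider/model-name format from the What's Different table (Qwen/Qwen3-235B is the example given there; substitute a model actually available to your key).

from openai import OpenAI

# Same SDK, different endpoint: point the client at GonkaGate with a gp- key.
client = OpenAI(
    base_url="https://api.gonkagate.com/v1",
    api_key="gp-..."
)

# Model IDs follow provider/model-name; Qwen/Qwen3-235B is the example from
# the table below. Swap in whichever Gonka Network model you use.
response = client.chat.completions.create(
    model="Qwen/Qwen3-235B",
    messages=[{"role": "user", "content": "Hello!"}]
)
print(response.choices[0].message.content)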
TypeScript
before.ts
import OpenAI from 'openai';
const client = new OpenAI(); // Uses OPENAI_API_KEY env
const response = await client.chat.completions.create({
model: 'gpt-5.2',
messages: [{ role: 'user', content: 'Hello!' }],
});

Go
before.go
package main

import (
	"context"
	"fmt"

	openai "github.com/sashabaranov/go-openai"
)

func main() {
	client := openai.NewClient("sk-...")

	resp, err := client.CreateChatCompletion(
		context.Background(),
		openai.ChatCompletionRequest{
			Model: "gpt-5.2",
			Messages: []openai.ChatCompletionMessage{
				{Role: "user", Content: "Hello!"},
			},
		},
	)
	if err != nil {
		panic(err)
	}
	// Print the assistant's reply.
	fmt.Println(resp.Choices[0].Message.Content)
}

From Anthropic
Anthropic uses a different SDK structure. You'll need to switch to the OpenAI SDK with GonkaGate configuration; a migrated version follows the example below.
- Replace the Anthropic SDK with the OpenAI SDK
- Configure base_url to point to GonkaGate
- Use Gonka Network models instead of Claude models
before_anthropic.py
from anthropic import Anthropic
client = Anthropic(api_key="sk-ant-...")
message = client.messages.create(
model="claude-opus-4.5",
max_tokens=1024,
messages=[{"role": "user", "content": "Hello!"}]
)
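
For comparison, a minimal migrated sketch of the example above, assuming you switch to the OpenAI SDK as described: the client points at GonkaGate, messages.create becomes chat.completions.create, and the Claude model is replaced with the provider/model-name example from the What's Different table (substitute your own model).

from openai import OpenAI

# The OpenAI SDK pointed at GonkaGate replaces the Anthropic client.
client = OpenAI(
    base_url="https://api.gonkagate.com/v1",
    api_key="gp-..."
)

# The messages list keeps the same role/content shape; max_tokens is also
# accepted by the Chat Completions API.
response = client.chat.completions.create(
    model="Qwen/Qwen3-235B",  # a Gonka Network model instead of claude-opus-4.5
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello!"}]
)
print(response.choices[0].message.content)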
From Azure OpenAI

Remove Azure-specific configuration. The standard OpenAI SDK works with GonkaGate; a migrated version follows the example below.
- Replace AzureOpenAI with standard OpenAI client
- Remove api_version and azure_endpoint parameters
- Set base_url to https://api.gonkagate.com/v1
before_azure.py
from openai import AzureOpenAI
client = AzureOpenAI(
api_version="2025-01-01",
azure_endpoint="https://your-resource.openai.azure.com",
api_key="your-azure-key"
)
response = client.chat.completions.create(
model="gpt-5-deployment",
messages=[{"role": "user", "content": "Hello!"}]
)
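
For comparison, a minimal migrated sketch of the Azure example above: the AzureOpenAI client, api_version, and azure_endpoint disappear, and the deployment name is replaced by a GonkaGate model ID (again using the provider/model-name example from the table below; substitute your own).

from openai import OpenAI

# Standard OpenAI client; no api_version or azure_endpoint required.
client = OpenAI(
    base_url="https://api.gonkagate.com/v1",
    api_key="gp-..."
)

# Azure deployment names give way to provider/model-name model IDs.
response = client.chat.completions.create(
    model="Qwen/Qwen3-235B",
    messages=[{"role": "user", "content": "Hello!"}]
)
print(response.choices[0].message.content)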
Compatibility Matrix

GonkaGate supports most OpenAI API features. Here's what's compatible (a streaming sketch follows the table):

| Feature | OpenAI | GonkaGate | Notes |
|---|---|---|---|
| Chat Completions | Supported | Supported | — |
| Streaming (SSE) | Supported | Supported | — |
| Function Calling / Tools | Supported | Supported | — |
| JSON Mode | Supported | Supported | — |
| Vision (Image Input) | Supported | Partial | Model-dependent |
| Embeddings | Supported | Partial | If model supports |
| Audio (Whisper/TTS) | Supported | Not supported | — |
| Fine-tuning | Supported | Not supported | — |
| Assistants API | Supported | Not supported | — |
| Batch API | Supported | Not supported | — |
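
Since the matrix lists streaming as supported, the standard OpenAI SDK streaming loop should carry over unchanged. A minimal sketch, assuming a GonkaGate key and the example model ID from the table below:

from openai import OpenAI

client = OpenAI(
    base_url="https://api.gonkagate.com/v1",
    api_key="gp-..."
)

# stream=True yields SSE chunks, each carrying an incremental delta of the reply.
stream = client.chat.completions.create(
    model="Qwen/Qwen3-235B",
    messages=[{"role": "user", "content": "Hello!"}],
    stream=True,
)
for chunk in stream:
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)
print()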
What's Different
Key differences between OpenAI and GonkaGate (a model-discovery sketch follows the table):
| Aspect | OpenAI | GonkaGate |
|---|---|---|
| Model ID Format | gpt-5.2, gpt-5.1-mini, gpt-5.1-nano, etc. | provider/model-name (e.g., Qwen/Qwen3-235B) |
| Organization | Required for some accounts | Not needed |
| Rate Limits | Tier-based limits | Pay as you go, no tiers |
| API Key Format | sk-... | gp-... |
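
Because the model ID format differs, the one list you will almost certainly need to update during migration is your model names. Assuming GonkaGate also exposes the standard /v1/models endpoint (not covered by the matrix above), the SDK's models.list() call is a quick way to see which provider/model-name IDs your key can use:

from openai import OpenAI

client = OpenAI(
    base_url="https://api.gonkagate.com/v1",
    api_key="gp-..."
)

# Print the model IDs available to this key, e.g. Qwen/Qwen3-235B.
for model in client.models.list():
    print(model.id)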