Supports text generation calls through both OpenAI-compatible and Anthropic-compatible interfaces.

OpenAI compatible interface

For features not covered in this document, please refer to the official OpenAI documentation.

Message body structure

Message Type | Function Description | Sample Content
system | Model instructions that set the AI's role and describe how the model should generally behave and respond | Example: "You are a pediatrician with 10 years of experience"
user | End-user input passed to the model | Example: "What should I do if a young child has a persistent low-grade fever?"
assistant | Previous responses generated by the model, giving it examples of how it should respond to the current request | Example: "It is recommended to take the temperature first…"
If you want the model to follow layered instructions, you can use message roles to improve output quality. Role assignments do not always behave deterministically, however, so it is recommended to experiment with different arrangements, compare the results, and keep the approach that works best for you.
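As an illustration, here is a minimal sketch of a request that combines all three roles through the OpenAI-compatible interface covered in this section; the conversation content is invented for the example, and BASE_URL is a placeholder for this site's endpoint.

from openai import OpenAI

client = OpenAI(
    base_url="https://BASE_URL/v1",
    api_key="",  # Replace with your API Key on this site
)

response = client.chat.completions.create(
    model="deepseek-r1",
    messages=[
        # system: set the overall role and tone of the assistant
        {"role": "system", "content": "You are a pediatrician with 10 years of experience."},
        # earlier user/assistant turns act as history and show the expected style
        {"role": "user", "content": "My child has had a low-grade fever since last night. What should I do?"},
        {"role": "assistant", "content": "It is recommended to take the temperature first and keep the child hydrated."},
        # the current user question the model should answer
        {"role": "user", "content": "The temperature is still 37.8 °C this morning. Should we see a doctor?"}
    ]
)

print(response.choices[0].message.content)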

Basic conversation

from openai import OpenAI

client = OpenAI(
    base_url="https://BASE_URL/v1",
    api_key="",  # Replace with your API Key on this site
)

response = client.chat.completions.create(
    model="deepseek-r1",
    messages=[
        {
            "role": "user",
            "content": "say 1"
        }
    ]
)

print(response.choices[0].message.content)

Streaming response

from openai import OpenAI

client = OpenAI(
    base_url="https://BASE_URL/v1",
    api_key="",  # Replace with your API Key on this site
)

stream = client.chat.completions.create(
    model="deepseek-r1",
    messages=[
        {
            "role": "user",
            "content": "Write a poem about spring"
        }
    ],
    stream=True
)

for chunk in stream:
    if chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)

Function Calling

For more details, refer to the OpenAI Function Calling Guide.
from openai import OpenAI

client = OpenAI(
    base_url="https://BASE_URL/v1",
    api_key="",  # Replace with your API Key on this site
)

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get weather information for a specified city",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {
                        "type": "string",
                        "description": "city name"
                    }
                },
                "required": ["city"]
            }
        }
    }
]

response = client.chat.completions.create(
    model="deepseek-r1",
    messages=[
        {
            "role": "user",
            "content": "How is the weather in Beijing today?"
        }
    ],
    tools=tools
)

print(response.choices[0].message)
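If the model decides to call the tool, the returned message carries a tool_calls entry instead of plain text. The following is a minimal sketch of completing the round trip, following the standard OpenAI tool-calling pattern; get_weather here is a hypothetical local implementation supplied only for the example.

import json

# Hypothetical local implementation of the tool; replace with a real weather lookup
def get_weather(city):
    return {"city": city, "condition": "sunny", "temperature_c": 25}

message = response.choices[0].message

if message.tool_calls:
    call = message.tool_calls[0]
    args = json.loads(call.function.arguments)
    result = get_weather(**args)

    # Return the tool result so the model can produce a final, natural-language answer
    follow_up = client.chat.completions.create(
        model="deepseek-r1",
        messages=[
            {"role": "user", "content": "How is the weather in Beijing today?"},
            message,  # the assistant turn that requested the tool call
            {
                "role": "tool",
                "tool_call_id": call.id,
                "content": json.dumps(result)
            }
        ],
        tools=tools
    )
    print(follow_up.choices[0].message.content)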

Things to note

If you encounter any problems during use:
  • Contact our technical support team
  • Submit a ticket with your feedback in our Ticket Center
  1. Set OPENAI_BASE_URL to https://BASE_URL/v1 (see the configuration sketch after this list).
  2. Set OPENAI_API_KEY to your API Key.
  3. The temperature parameter accepts values in the range [0.0, 1.0]; 1.0 is recommended. Values outside this range will return an error.
  4. Some OpenAI parameters (such as presence_penalty, frequency_penalty, and logit_bias) are ignored.
  5. The legacy function_call parameter is deprecated; use the tools parameter instead.
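As a convenience, both values can also be supplied through environment variables, which the OpenAI Python SDK reads automatically when the client is constructed. A minimal sketch, assuming the SDK's standard environment-variable handling, with BASE_URL and the API key as placeholders:

import os
from openai import OpenAI

os.environ["OPENAI_BASE_URL"] = "https://BASE_URL/v1"
os.environ["OPENAI_API_KEY"] = "YOUR_API_KEY"  # Replace with your API Key on this site

# The client picks up OPENAI_BASE_URL and OPENAI_API_KEY from the environment
client = OpenAI()

response = client.chat.completions.create(
    model="deepseek-r1",
    temperature=1.0,  # recommended value; must stay within [0.0, 1.0]
    messages=[
        {"role": "user", "content": "say 1"}
    ]
)

print(response.choices[0].message.content)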

Anthropic compatible interface

For features not covered in this document, please refer to the official Anthropic documentation.

Basic conversation

import anthropic

client = anthropic.Anthropic(
    base_url="https://BASE_URL/anthropic",
    api_key="",  # Replace with your API Key on this site
)

message = client.messages.create(
    model="deepseek-r1",
    max_tokens=1024,
    messages=[
        {
            "role": "user",
            "content": "Please explain in simple language what machine learning is"
        }
    ]
)

print(message.content[0].text)

Streaming response

import anthropic

client = anthropic.Anthropic(
    base_url="https://BASE_URL/anthropic",
    api_key="",  # Replace with your API Key on this site
)

with client.messages.stream(
    model="deepseek-r1",
    max_tokens=1024,
    messages=[
        {
            "role": "user",
            "content": "Write a poem about spring"
        }
    ]
) as stream:
    for text in stream.text_stream:
        print(text, end="", flush=True)

Function Calling

For more details, refer to the Anthropic Function Calling Guide.
import anthropic

client = anthropic.Anthropic(
    base_url="https://BASE_URL/anthropic",
    api_key="",  # Replace with your API Key on this site
)

tools = [
    {
        "name": "get_weather",
        "description": "Get weather information for a specified city",
        "input_schema": {
            "type": "object",
            "properties": {
                "city": {
                    "type": "string",
                    "description": "city name"
                }
            },
            "required": ["city"]
        }
    }
]

message = client.messages.create(
    model="deepseek-r1",
    max_tokens=1024,
    tools=tools,
    messages=[
        {
            "role": "user",
            "content": "How is the weather in Beijing today?"
        }
    ]
)

print(message.content)
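When the model decides to use the tool, message.content includes a tool_use block and message.stop_reason is "tool_use". The following is a minimal sketch of returning the tool result, following the standard Anthropic tool-use pattern; get_weather is again a hypothetical local implementation supplied only for the example.

import json

# Hypothetical local implementation of the tool; replace with a real weather lookup
def get_weather(city):
    return {"city": city, "condition": "sunny", "temperature_c": 25}

if message.stop_reason == "tool_use":
    tool_use = next(block for block in message.content if block.type == "tool_use")
    result = get_weather(**tool_use.input)

    # Return the tool result so the model can produce a final, natural-language answer
    follow_up = client.messages.create(
        model="deepseek-r1",
        max_tokens=1024,
        tools=tools,
        messages=[
            {"role": "user", "content": "How is the weather in Beijing today?"},
            {"role": "assistant", "content": message.content},
            {
                "role": "user",
                "content": [
                    {
                        "type": "tool_result",
                        "tool_use_id": tool_use.id,
                        "content": json.dumps(result)
                    }
                ]
            }
        ]
    )
    print(follow_up.content[0].text)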

Things to note

  1. Set ANTHROPIC_BASE_URL to https://BASE_URL/anthropic, matching the base_url used in the examples above (see the sketch after this list).
  2. Set ANTHROPIC_API_KEY to your API Key.
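As with the OpenAI-compatible interface, both values can be supplied through environment variables, which the Anthropic Python SDK reads automatically when the client is constructed. A minimal sketch, assuming the SDK's standard environment-variable handling, with BASE_URL and the API key as placeholders:

import os
import anthropic

os.environ["ANTHROPIC_BASE_URL"] = "https://BASE_URL/anthropic"
os.environ["ANTHROPIC_API_KEY"] = "YOUR_API_KEY"  # Replace with your API Key on this site

# The client picks up ANTHROPIC_BASE_URL and ANTHROPIC_API_KEY from the environment
client = anthropic.Anthropic()

message = client.messages.create(
    model="deepseek-r1",
    max_tokens=1024,
    messages=[
        {"role": "user", "content": "say 1"}
    ]
)

print(message.content[0].text)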