Chat Completions API

OpenAI-compatible interface for Ling-1T chat.

The chat completions endpoint mirrors OpenAI's v1/chat/completions contract, so existing OpenAI client code can fetch or stream Ling-1T responses with minimal changes.

  • Endpoint: POST https://ling-1t.ai/api/v1/chat/completions
  • Auth: Authorization: Bearer <api-key>
  • Model: inclusionai/ling-1t

Request Example (JSON)

POST /api/v1/chat/completions HTTP/1.1
Host: ling-1t.ai
Authorization: Bearer LING1T-...
Content-Type: application/json
 
{
  "model": "inclusionai/ling-1t",
  "messages": [
    { "role": "system", "content": "You are a precise financial analyst." },
    { "role": "user", "content": "Summarize Q4 revenue trends for APAC." }
  ],
  "temperature": 0.4,
  "max_tokens": 512,
  "stream": false
}

Node.js (TypeScript)

// Node.js 18+ ships a global fetch; the node-fetch import is only needed on older runtimes.
import fetch from 'node-fetch';
 
const response = await fetch('https://ling-1t.ai/api/v1/chat/completions', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    Authorization: `Bearer ${process.env.LING1T_API_KEY}`,
  },
  body: JSON.stringify({
    model: 'inclusionai/ling-1t',
    messages: [
      { role: 'system', content: 'You are a precise financial analyst.' },
      { role: 'user', content: 'Summarize Q4 revenue trends for APAC.' },
    ],
  }),
});
 
if (!response.ok) {
  throw new Error(`Request failed: ${response.status} ${await response.text()}`);
}
 
const data = await response.json();
console.log(data.choices[0].message?.content);

Python

import os
 
import requests
 
# Read the API key from the environment rather than hard-coding it.
API_KEY = os.environ["LING1T_API_KEY"]
 
headers = {
    "Authorization": f"Bearer {API_KEY}",
    "Content-Type": "application/json",
}
 
payload = {
    "model": "inclusionai/ling-1t",
    "messages": [
        {"role": "system", "content": "You are a precise financial analyst."},
        {"role": "user", "content": "Summarize Q4 revenue trends for APAC."}
    ],
    "temperature": 0.4,
    "max_tokens": 512
}
 
resp = requests.post("https://ling-1t.ai/api/v1/chat/completions", json=payload, headers=headers)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])

Streaming Responses

Set stream: true to receive Server-Sent Events (SSE). The data format matches OpenAI’s, enabling drop-in use of existing clients.

data: {"id":"chatcmpl-...","object":"chat.completion.chunk","choices":[{"index":0,"delta":{"content":"你好"}}],"model":"inclusionai/ling-1t"}
 
...
data: [DONE]
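
A minimal Python sketch of parsing this SSE format. The `iter_sse_content` helper name is ours, not part of any SDK; it consumes an iterable of `data:` lines (for a live request, pass `resp.iter_lines(decode_unicode=True)` from a `requests.post(..., stream=True)` call) and yields the `delta.content` fragments until `[DONE]`:

```python
import json


def iter_sse_content(lines):
    """Yield content deltas from OpenAI-style 'data:' SSE lines."""
    for line in lines:
        if not line.startswith("data: "):
            continue  # skip blank separators and keep-alive comments
        payload = line[len("data: "):]
        if payload == "[DONE]":
            return  # end-of-stream sentinel
        chunk = json.loads(payload)
        delta = chunk["choices"][0].get("delta", {})
        if "content" in delta:
            yield delta["content"]


# Demonstration using the chunk format shown above:
sample = [
    'data: {"id":"chatcmpl-1","object":"chat.completion.chunk",'
    '"choices":[{"index":0,"delta":{"content":"你好"}}],"model":"inclusionai/ling-1t"}',
    "data: [DONE]",
]
print("".join(iter_sse_content(sample)))  # 你好
```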

Usage Metrics

Responses include token usage in the OpenAI schema (usage.prompt_tokens, usage.completion_tokens). These values feed billing and are visible in the dashboard usage explorer.
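
A short sketch of reading those fields. Here `data` stands in for a parsed non-streaming response body; the field names follow the OpenAI usage schema described above, and the token counts are illustrative:

```python
# Hypothetical parsed response body; usage field names follow the OpenAI schema.
data = {
    "choices": [{"message": {"role": "assistant", "content": "..."}}],
    "usage": {"prompt_tokens": 42, "completion_tokens": 128, "total_tokens": 170},
}

usage = data["usage"]
# total_tokens is the sum of prompt and completion tokens.
billed = usage["prompt_tokens"] + usage["completion_tokens"]
print(f"prompt={usage['prompt_tokens']} completion={usage['completion_tokens']} total={billed}")
```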
