
New API

Get model information

Endpoint: GET https://aihubmix.com/api/v1/models
Description: returns detailed information for all available models.

Model object fields

  • data (array): list of model objects.
  • model_id (string): unique model identifier.
  • desc (string): model description (in English).
  • types (string): model type. Supported values: llm (large language model), image_generation (image generation), video (video generation), tts (text-to-speech), stt (speech-to-text), embedding (embedding model), rerank (reranking model).
  • features (string): supported features. Supported values: thinking (reasoning), tools (tool calling), function_calling (function calling), web (web search), deepsearch (deep search), long_context (long-context model), structured_outputs (structured output).
  • input_modalities (string): supported input modalities. Supported values: text, image, audio, video, pdf.
  • max_output (integer): maximum number of output tokens.
  • context_length (integer): context window size (maximum number of input tokens).
  • pricing (object): pricing information.
  • pricing.input (number): input token price (USD per 1K tokens).
  • pricing.output (number): output token price (USD per 1K tokens).
  • pricing.cache_read (number): cache read price (USD per 1K tokens; optional field).
  • pricing.cache_write (number): cache write price (USD per 1K tokens; optional field).
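As a reading aid, here is a minimal Python sketch of the model object as a typed structure. The dataclasses are illustrative helpers, not part of the API; field names and types follow the descriptions above and the response example below.

from dataclasses import dataclass
from typing import Optional


@dataclass
class ModelPricing:
    """USD prices per 1K tokens, per the field descriptions above."""
    input: float
    output: float
    cache_read: Optional[float] = None   # optional field
    cache_write: Optional[float] = None  # optional field


@dataclass
class ModelInfo:
    """One entry of the `data` array returned by GET /api/v1/models."""
    model_id: str
    desc: str
    types: str              # e.g. "llm"
    features: str           # comma-separated, e.g. "thinking,tools"
    input_modalities: str   # comma-separated, e.g. "text,image"
    max_output: int
    context_length: int
    pricing: ModelPricing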

Request example

import requests

# Endpoint URL
url = "https://aihubmix.com/api/v1/models"

# Fetch all available models
response = requests.get(url)
print(response.json())

# Fetch a filtered, sorted subset of models
params = {
    "type": "llm",
    "modalities": "text",
    "model": "gpt-5",
    "features": "thinking",
    "sort_by": "context_length",
    "sort_order": "desc"
}
response = requests.get(url, params=params)
print(response.json())

Request parameters (for filtering)

  • type (string): model type. Supported values: llm (large language model), image_generation (image generation), video (video generation), tts (text-to-speech), stt (speech-to-text), embedding (embedding model), rerank (reranking model).
  • modalities (string): input modality. Supported values: text, image, audio, video, pdf. Multiple modalities can be queried at once (comma-separated).
  • model (string): fuzzy search on the model name (partial matches supported).
  • features (string): model features. Supported values: thinking (reasoning), tools (tool calling), function_calling (function calling), web (web search), deepsearch (deep search), long_context (long-context model), structured_outputs (structured output). Multiple features can be queried at once (comma-separated).
  • sort_by (string): sort field. Supported values: model_ratio (sort by cost-effectiveness), context_length (sort by context length), coding (list coding models first), order (default order).
  • sort_order (string): sort direction. Supported values: asc (ascending), desc (descending).
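Multiple values for modalities or features are combined in a single comma-separated parameter, as noted above. A minimal sketch (the specific filter values are only an example):

import requests

url = "https://aihubmix.com/api/v1/models"

# LLMs that accept both text and image input and support reasoning plus tool
# calling; multiple values for one parameter are comma-separated.
params = {
    "type": "llm",
    "modalities": "text,image",
    "features": "thinking,tools",
}
response = requests.get(url, params=params)
for model in response.json()["data"]:
    print(model["model_id"], "-", model["features"])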

Successful response example

{
    "data": [
        {
            "model_id": "gpt-5",
            "desc": "GPT-5 is OpenAI flagship model for coding, reasoning, and agentic tasks across domains.",
            "pricing": {
                "cache_read": 0.125,
                "input": 1.25,
                "output": 10
            },
            "types": "llm",
            "features": "thinking,tools,function_calling,structured_outputs",
            "input_modalities": "text,image",
            "max_output": 128000,
            "context_length": 400000
        },
        {
            "model_id": "gpt-5-codex",
            "desc": "GPT-5-Codex is a version of GPT-5 optimized for autonomous coding tasks in Codex or similar environments. It is only available in the Responses API, and the underlying model snapshots will be updated regularly. https://docs.aihubmix.com/en/api/Responses-API You can also use it in codex-cll; see https://docs.aihubmix.com/en/api/Codex-CLI for using codex-cll through Aihubmix.",
            "pricing": {
                "cache_read": 0.125,
                "input": 1.25,
                "output": 10
            },
            "types": "llm",
            "features": "thinking,tools,function_calling,structured_outputs",
            "input_modalities": "text,image",
            "max_output": 128000,
            "context_length": 400000
        },
        {
            "model_id": "gpt-5-mini",
            "desc": "GPT-5 mini is a faster, more cost-efficient version of GPT-5. It's great for well-defined tasks and precise prompts.",
            "pricing": {
                "cache_read": 0.025,
                "input": 0.25,
                "output": 2
            },
            "types": "llm",
            "features": "thinking,tools,function_calling,structured_outputs",
            "input_modalities": "text,image",
            "max_output": 128000,
            "context_length": 400000
        },
        {
            "model_id": "gpt-5-nano",
            "desc": "GPT-5 Nano is our fastest, cheapest version of GPT-5. It's great for summarization and classification tasks.",
            "pricing": {
                "cache_read": 0.005,
                "input": 0.05,
                "output": 0.4
            },
            "types": "llm",
            "features": "thinking,tools,function_calling,structured_outputs",
            "input_modalities": "text,image",
            "max_output": 128000,
            "context_length": 400000
        },
        {
            "model_id": "gpt-5-pro",
            "desc": "GPT-5 pro uses more compute to think harder and provide consistently better answers.\n\nGPT-5 pro is available in the Responses API only to enable support for multi-turn model interactions before responding to API requests, and other advanced API features in the future. Since GPT-5 pro is designed to tackle tough problems, some requests may take several minutes to finish. To avoid timeouts, try using background mode. As our most advanced reasoning model, GPT-5 pro defaults to (and only supports) reasoning.effort: high. GPT-5 pro does not support code interpreter.",
            "pricing": {
                "input": 15,
                "output": 120
            },
            "types": "llm",
            "features": "thinking,tools,function_calling,structured_outputs",
            "input_modalities": "text,image",
            "max_output": 128000,
            "context_length": 400000
        }
    ],
    "message": "",
    "success": true
}
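The response is a flat, unpaginated list, so any further ranking can be done on the client. A sketch that sorts the returned models by input price (the endpoint call matches the documentation; the post-processing is purely illustrative):

import requests

response = requests.get("https://aihubmix.com/api/v1/models", params={"type": "llm"})
models = response.json()["data"]

# Cheapest input price first; models without pricing sort last.
by_input_price = sorted(models, key=lambda m: m.get("pricing", {}).get("input", float("inf")))
for m in by_input_price[:5]:
    pricing = m["pricing"]
    print(f'{m["model_id"]}: input {pricing["input"]}, output {pricing["output"]} (USD per 1K tokens)')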

Usage scenario example

GET https://aihubmix.com/api/v1/models?type=llm
Returns only large language models (type=llm).
Note: when the coding-first sort is used (sort_by=coding), models tagged coding are listed first and the remaining models follow the default order, as shown in the sketch below.
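A minimal sketch of the coding-first sort described above, using the sort_by parameter from the filter list:

import requests

# List LLMs with coding-tagged models ranked first.
response = requests.get(
    "https://aihubmix.com/api/v1/models",
    params={"type": "llm", "sort_by": "coding"},
)
print([m["model_id"] for m in response.json()["data"]])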

Performance optimization

Caching

  • Caching strategy: HTTP caching with a 300-second (5-minute) lifetime
  • Cache control: Cache-Control: public, max-age=300, stale-while-revalidate=300
  • Content validation: ETag content-hash validation is supported

Cache usage example

# Conditional request using an ETag
curl -H "If-None-Match: \"abc123...\"" \
     https://aihubmix.com/api/v1/models
If the content has not changed, the server returns a 304 Not Modified status code.
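The same conditional request in Python, as a sketch; the requests library does not cache responses by itself, so the ETag and body are kept by hand here.

import requests

url = "https://aihubmix.com/api/v1/models"

# First request: remember the ETag and the parsed body.
first = requests.get(url)
etag = first.headers.get("ETag")
cached_body = first.json()

# Later request: send If-None-Match; a 304 means the cached copy is still fresh.
headers = {"If-None-Match": etag} if etag else {}
later = requests.get(url, headers=headers)
payload = cached_body if later.status_code == 304 else later.json()
print(len(payload["data"]), "models")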

Error handling

{
  "success": false,
  "message": "请求参数格式错误"
}
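A minimal sketch of handling this error shape on the client side (the exact message text will vary):

import requests

response = requests.get("https://aihubmix.com/api/v1/models", params={"type": "llm"})
payload = response.json()

if not payload.get("success", False):
    # Errors are signalled by the `success` flag plus a human-readable `message`.
    raise RuntimeError(f"Model listing failed: {payload.get('message', 'unknown error')}")

models = payload["data"]
print(len(models), "models returned")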

Important notes

  1. Data completeness: this endpoint returns all models that match the filters; results are not paginated.
  2. Type compatibility: legacy type identifiers are automatically mapped to the new ones (see the sketch after this list):
    • t2t → llm
    • t2i → image_generation
    • t2v → video
    • reranking → rerank
  3. Filtering logic: multiple filter conditions are combined with logical AND.
  4. Sorting rules: when no sort is specified, models are returned in the system's default order.
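The mapping in note 2, expressed as a small client-side lookup table; the server already performs this mapping automatically, so this is only a convenience sketch:

# Legacy type identifier -> new `type` value accepted by /api/v1/models
LEGACY_TYPE_MAP = {
    "t2t": "llm",
    "t2i": "image_generation",
    "t2v": "video",
    "reranking": "rerank",
}

def normalize_type(value: str) -> str:
    """Translate a legacy type identifier into the new one, if needed."""
    return LEGACY_TYPE_MAP.get(value, value)

print(normalize_type("t2i"))  # image_generation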

Legacy API

⚠️ Note: the endpoints below belong to the legacy API; prefer the new API above for better performance and functionality.

Get model list

Endpoint: GET /v1/models
  • If a user is logged in, the endpoint returns the models available to that user's group; otherwise it returns the models available to the default group.
  • If the request header contains an Authorization field, the endpoint returns the model list configured for the token associated with that key (see the sketch below).
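A minimal sketch of calling the legacy list endpoint with an API key; the Bearer-style Authorization header and the aihubmix.com base URL are assumptions, and AIHUBMIX_API_KEY is a placeholder:

import os
import requests

api_key = os.environ["AIHUBMIX_API_KEY"]  # placeholder environment variable

# Legacy list endpoint; with Authorization set, the models configured for this
# key's token are returned.
response = requests.get(
    "https://aihubmix.com/v1/models",
    headers={"Authorization": f"Bearer {api_key}"},
)
print([m["id"] for m in response.json()["data"]])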
Response example:
{
  "data": [
    {
      "id": "gpt-4o-mini",
      "object": "model",
      "created": 1626777600,
      "owned_by": "OpenAI",
      "permission": [
        {
          "id": "modelperm-LwHkVFn8AcMItP432fKKDIKJ",
          "object": "model_permission",
          "created": 1626777600,
          "allow_create_engine": true,
          "allow_sampling": true,
          "allow_logprobs": true,
          "allow_search_indices": false,
          "allow_view": true,
          "allow_fine_tuning": false,
          "organization": "*",
          "group": null,
          "is_blocking": false
        }
      ],
      "root": "gpt-4o-mini",
      "parent": null
    }
  ]
}

Response

Status code: 200 (OK). Data model: Inline.

Response data structure

Status code 200
» data ([object], required)
»» id (string, required): model ID
»» object (string, required): always "model"
»» created (integer, required): creation timestamp
»» owned_by (string, required): developer
»» permission ([object] | null, required)
»»» id (string, required)
»»» object (string, required)
»»» created (integer, required)
»»» allow_create_engine (boolean, required)
»»» allow_sampling (boolean, required)
»»» allow_logprobs (boolean, required)
»»» allow_search_indices (boolean, required)
»»» allow_view (boolean, required)
»»» allow_fine_tuning (boolean, required)
»»» organization (string, required)
»»» group (null, required)
»»» is_blocking (boolean, required)
»» root (string, required): model name
»» parent (null, required): parent node

Get model information

Endpoint: GET /v1/models/:model

Request parameters

  • model (path parameter, string): model ID
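A minimal sketch of querying a single model by ID; the model name is only an example, and the base URL and Bearer authorization are assumed to match the list endpoint above:

import os
import requests

api_key = os.environ["AIHUBMIX_API_KEY"]  # placeholder environment variable

# :model is a path parameter, e.g. the gpt-4o-mini entry from the example above.
response = requests.get(
    "https://aihubmix.com/v1/models/gpt-4o-mini",
    headers={"Authorization": f"Bearer {api_key}"},
)
print(response.json())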
Response example:
200 Response
{
  "id": "string",
  "object": "string",
  "created": 0,
  "owned_by": "string",
  "permission": [
    {
      "id": "string",
      "object": "string",
      "created": 0,
      "allow_create_engine": true,
      "allow_sampling": true,
      "allow_logprobs": true,
      "allow_search_indices": true,
      "allow_view": true,
      "allow_fine_tuning": true,
      "organization": "string",
      "group": null,
      "is_blocking": true
    }
  ],
  "root": "string",
  "parent": null
}

Response

Status code: 200 (OK). Data model: Inline.

Response data structure

Status code 200
id (string, required): model ID
object (string, required): always "model"
created (integer, required): creation timestamp
owned_by (string, required): developer
permission ([object], required)
» id (string, optional)
» object (string, optional)
» created (integer, optional)
» allow_create_engine (boolean, optional)
» allow_sampling (boolean, optional)
» allow_logprobs (boolean, optional)
» allow_search_indices (boolean, optional)
» allow_view (boolean, optional)
» allow_fine_tuning (boolean, optional)
» organization (string, optional)
» group (null, optional)
» is_blocking (boolean, optional)
root (string, required): model name
parent (null, required): parent node