
DeepSeek-Chat

Input: $0.216/M
Output: $0.88/M
Context: 64K
Max output: 64K
The most popular and cost-effective model in the DeepSeek-V3 series, the full 671B-parameter version. This model supports a maximum context length of 64,000 tokens.

What is DeepSeek-Chat?

DeepSeek-Chat refers to DeepSeek’s chat-oriented deployments built on the DeepSeek V3 series (most recently DeepSeek-V3.2 and the higher-performance variant DeepSeek-V3.2-Speciale). These models are “reasoning-first” large language models (LLMs) optimized for long-context reasoning, tool use (agentic workflows), and code and math tasks.

Main features and architectural highlights

  • Reasoning-first design & hybrid inference: DeepSeek emphasizes a “think / non-think” dual mode, so the same weights can behave as a fast generator or as a deliberative agent that internally composes multi-step plans before calling tools (their marketing calls this “thinking in tool-use”). This behavior is baked into the training data and the product UX.
  • Long-context and sparse attention: DeepSeek implements a sparse/efficient attention variant (marketed as DeepSeek Sparse Attention / NSA) intended to make 100k+ token windows practical and cheaper to run than dense attention at the same length. This is core to their claim of supporting very large documents/agent histories.
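
The agentic “thinking in tool-use” flow above ultimately bottoms out in the client executing whatever tool the model requested. Since deepseek-chat is served through an OpenAI-compatible endpoint (see the code examples further down), a returned tool call can be dispatched locally. This is a minimal sketch; the tool name get_weather and its arguments are hypothetical, and the dict mirrors the shape of one entry in response.choices[0].message.tool_calls.

```python
import json

# Hypothetical local tool; the name and signature are illustrative only.
def get_weather(city: str) -> str:
    return f"Sunny in {city}"

TOOLS = {"get_weather": get_weather}

def dispatch_tool_call(tool_call: dict) -> str:
    """Run the local function named in an OpenAI-style tool call.

    `tool_call` mirrors one entry of message.tool_calls: a function
    name plus JSON-encoded arguments.
    """
    name = tool_call["function"]["name"]
    args = json.loads(tool_call["function"]["arguments"])
    return TOOLS[name](**args)

# The structure a tool-use turn might return, as a plain dict:
call = {"function": {"name": "get_weather", "arguments": '{"city": "Paris"}'}}
print(dispatch_tool_call(call))  # Sunny in Paris
```

The result string would then be sent back to the model in a follow-up message with role "tool", closing the agent loop.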

Benchmark performance (selected, reproducible metrics)

Below are representative numbers drawn from the DeepSeek V3 public benchmark tables (Hugging Face / vendor results). When quoting benchmarks, note that vendor pages typically fix the evaluation settings (temperature, prompting, output-length limits) and report many metrics; the numbers below are representative highlights rather than an exhaustive table.

  • Mathematics:
    • MATH-500 (EM): ~90.2% (DeepSeek-V3 reported).
    • GSM8K: ~89.3% (8-shot math accuracy reported in vendor tables).
  • Code: HumanEval (Pass@1): vendor tables report 65.2% (0-shot) in one evaluation; other evaluation variants using specialized chat/code configurations yield Pass@1 values up to the low 80s. (See the vendor benchmark pages for the exact evaluation variant.)
  • General reasoning & benchmarks: on MMLU, BBH, and AGIEval, DeepSeek V3 ranks highly among open-weight models and is reported in vendor tables as competitive with, or approaching, frontier closed models on selected reasoning and problem-solving benchmarks. The vendor materials highlight the strongest wins in the math and code categories.

How to access the deepseek-chat API

Step 1: Sign Up for API Key

Log in at cometapi.com; if you are not yet a user, register first. In your CometAPI console, open the API token page in the personal center, click “Add Token”, and copy the generated key (of the form sk-xxxxx).

Step 2: Send Requests to deepseek-chat API

Select the “deepseek-chat” endpoint and set the request body. The request method and body format are documented in our website’s API doc; the site also provides an Apifox playground for convenient testing. Replace <YOUR_API_KEY> with your actual CometAPI key from your account, and use the chat-completions base URL.

Put your question or request in the content field; this is the text the model will respond to.

Step 3: Retrieve and Verify Results

Parse the API response to extract the generated answer; the response body also carries status information and usage data you can use to verify the call succeeded.
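
The retrieve-and-verify step can be sketched as a small helper that checks the response shape before trusting the output. The field names follow the OpenAI-compatible chat-completions schema used by the examples below; the sample response here is illustrative.

```python
def extract_answer(response: dict) -> str:
    """Pull the generated text out of an OpenAI-style chat completion,
    failing early if the response is malformed or empty."""
    choices = response.get("choices")
    if not choices:
        raise ValueError(f"no choices in response: {response}")
    content = choices[0].get("message", {}).get("content")
    if content is None:
        raise ValueError("choice has no message content")
    return content

# A minimal successful response, for illustration:
sample = {
    "choices": [{"message": {"role": "assistant", "content": "Hello!"}}],
    "usage": {"prompt_tokens": 12, "completion_tokens": 2},
}
print(extract_answer(sample))  # Hello!
```

Raising on a missing or empty choices list keeps a transport-level error page or truncated body from being silently treated as a model answer.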

Features of DeepSeek-Chat

Learn about DeepSeek-Chat’s core capabilities, which help improve performance and usability and enhance the overall experience.

Pricing of DeepSeek-Chat

See DeepSeek-Chat’s competitive pricing for different budgets and usage needs; flexible plans ensure it scales with your requirements.
CometAPI price (USD / M tokens): Input $0.216/M, Output $0.88/M
Official price (USD / M tokens): Input $0.27/M, Output $1.1/M
Discount: -20%
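
At the rates above, the cost of a call is a simple linear function of token counts. A small estimator makes the arithmetic concrete (the per-million prices are hard-coded from the CometAPI column above):

```python
COMET_INPUT_PER_M = 0.216   # USD per 1M input tokens (CometAPI price)
COMET_OUTPUT_PER_M = 0.88   # USD per 1M output tokens (CometAPI price)

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimated USD cost of one deepseek-chat call at CometAPI rates."""
    return (input_tokens * COMET_INPUT_PER_M
            + output_tokens * COMET_OUTPUT_PER_M) / 1_000_000

# e.g. a 100k-token prompt with a 20k-token answer:
print(round(estimate_cost(100_000, 20_000), 4))  # 0.0392
```

Note that output tokens cost roughly four times as much as input tokens, so long generations dominate the bill.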

Sample code and API for DeepSeek-Chat

Get complete sample code and API resources to simplify DeepSeek-Chat integration; we provide step-by-step guidance to help you unlock the model’s potential.

Python Code Example

from openai import OpenAI
import os

# Get your CometAPI key from https://api.cometapi.com/console/token, and paste it here
COMETAPI_KEY = os.environ.get("COMETAPI_KEY") or "<YOUR_COMETAPI_KEY>"
BASE_URL = "https://api.cometapi.com/v1"

client = OpenAI(base_url=BASE_URL, api_key=COMETAPI_KEY)

completion = client.chat.completions.create(
    model="deepseek-chat",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
)

print(completion.choices[0].message.content)

JavaScript Code Example

import OpenAI from "openai";

// Get your CometAPI key from https://api.cometapi.com/console/token, and paste it here
const api_key = process.env.COMETAPI_KEY;
const base_url = "https://api.cometapi.com/v1";

const openai = new OpenAI({
  apiKey: api_key,
  baseURL: base_url,
});

const completion = await openai.chat.completions.create({
  model: "deepseek-chat",
  messages: [
    { role: "system", content: "You are a helpful assistant." },
    { role: "user", content: "Hello!" },
  ],
});

console.log(completion.choices[0].message.content);

Curl Code Example

curl https://api.cometapi.com/v1/chat/completions \
     --header "Authorization: Bearer $COMETAPI_KEY" \
     --header "content-type: application/json" \
     --data \
'{
    "model": "deepseek-chat",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"}
    ]
}'
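
Assuming CometAPI’s OpenAI-compatible endpoint also passes through streaming (stream=True), the answer arrives as incremental delta chunks rather than one complete message. Independent of the transport, reassembling the final text looks like the sketch below; real SDK chunks are objects (chunk.choices[0].delta.content), modeled here as plain dicts for illustration.

```python
def collect_stream(chunks) -> str:
    """Concatenate the incremental `delta` content fields emitted by a
    streamed chat completion (stream=True) into the final answer."""
    parts = []
    for chunk in chunks:
        delta = chunk["choices"][0]["delta"]
        # The first chunk usually carries only the role; content may be absent.
        parts.append(delta.get("content") or "")
    return "".join(parts)

# The shape of the chunks a streamed response yields, as plain dicts:
mock_chunks = [
    {"choices": [{"delta": {"role": "assistant"}}]},
    {"choices": [{"delta": {"content": "Hel"}}]},
    {"choices": [{"delta": {"content": "lo!"}}]},
]
print(collect_stream(mock_chunks))  # Hello!
```

Streaming is useful for chat UIs, where tokens can be rendered as they arrive instead of waiting for the full completion.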

Versions of DeepSeek-Chat

DeepSeek-Chat may exist in multiple snapshots, for reasons including: keeping older versions available for consistency after updates, giving developers a migration window, and optimization differences between global and regional endpoints. See the official documentation for the exact differences.

Version: deepseek-chat

More models