Explore the GLM-4.7 Playground, an interactive environment where you can test the model and run queries in real time. Try prompts, adjust parameters, and iterate instantly to accelerate development and validate use cases.
GLM-4.7 is Z.ai / Zhipu AI’s latest flagship open foundation large language model (model name glm-4.7). It is positioned as a developer-oriented “thinking” model with particular improvements in coding and agentic task execution, multi-step reasoning, tool invocation, and long-context workflows. The release emphasizes a large context window (up to 200K tokens), a high maximum output (up to 128K tokens), and specialized “thinking” modes for agentic pipelines.
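Z.ai’s published GLM API exposes the “thinking” behavior as a request-level switch (a thinking object with type enabled or disabled). The sketch below shows how such a vendor-specific field could be forwarded through an OpenAI-compatible client via extra_body; the parameter name and the gateway’s pass-through behavior are assumptions, so check the CometAPI model docs before relying on them.

from openai import OpenAI
import os

# Minimal sketch: toggling GLM's "thinking" mode through an OpenAI-compatible client.
# The "thinking" field follows Z.ai's published GLM API convention and is assumed to be
# passed through unchanged by the CometAPI gateway; verify against the current docs.
client = OpenAI(
    base_url="https://api.cometapi.com/v1",
    api_key=os.environ.get("COMETAPI_KEY", "<YOUR_COMETAPI_KEY>"),
)

completion = client.chat.completions.create(
    model="glm-4.7",
    messages=[
        {"role": "user", "content": "Outline a plan to migrate a REST service to async I/O."}
    ],
    extra_body={"thinking": {"type": "enabled"}},  # assumed vendor-specific field
)
print(completion.choices[0].message.content)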
GLM-4.7’s publisher and community benchmark tables report substantial gains over GLM-4.6 and competitive results against other contemporary models on coding, agentic, and tool-use tasks. Selected numbers are available in the official Hugging Face / Z.ai published tables.
Log in to cometapi.com; if you do not have an account yet, register first. In your CometAPI console, create the API key that will serve as your access credential: open the API token page in the personal center, click “Add Token”, and submit to obtain a key of the form sk-xxxxx.
Select the glm-4.7 model, send the request to the chat-style endpoint, and set the request body. The request method and body format are documented in the CometAPI API docs, and the site also provides an Apifox test page for convenience. Replace <YOUR_API_KEY> with the actual CometAPI key from your account.
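If you prefer to call the endpoint directly instead of going through an SDK, the request is a standard OpenAI-style chat completion sent as JSON over HTTPS. Here is a minimal sketch with the requests library, assuming the chat completions path /v1/chat/completions on the CometAPI base URL:

import os
import requests

API_KEY = os.environ.get("COMETAPI_KEY", "<YOUR_COMETAPI_KEY>")
url = "https://api.cometapi.com/v1/chat/completions"  # assumed OpenAI-compatible path

payload = {
    "model": "glm-4.7",
    "messages": [
        {"role": "user", "content": "Summarize the benefits of long-context models."}
    ],
}
headers = {
    "Authorization": f"Bearer {API_KEY}",
    "Content-Type": "application/json",
}

response = requests.post(url, json=payload, headers=headers, timeout=60)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])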
Insert your question or request into the content field; this is what the model will respond to. After processing, the API responds with the task status and the generated content, which you parse to extract the answer.
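The fields below follow the OpenAI-compatible response schema (choices, message.content, usage); that layout is an assumption for this gateway, so adjust the keys if the returned JSON differs. A small helper for extracting the answer defensively:

# Minimal sketch of response handling, assuming an OpenAI-compatible response schema.
def extract_answer(response_json: dict) -> str:
    """Return the generated text, raising a clear error if the shape is unexpected."""
    try:
        return response_json["choices"][0]["message"]["content"]
    except (KeyError, IndexError, TypeError) as exc:
        raise ValueError(f"Unexpected response shape: {response_json}") from exc

# Usage with the JSON body returned by the request above:
# data = response.json()
# print(extract_answer(data))
# usage = data.get("usage", {})   # token counters, if the gateway reports them
# print(usage.get("prompt_tokens"), usage.get("completion_tokens"))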
| Comet Price (USD / M Tokens) | Official Price (USD / M Tokens) |
|---|---|
| Input: $0.44 / Output: $1.78 | Input: $0.56 / Output: $2.22 |
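To estimate what a single call costs at the CometAPI rate, multiply the usage counters returned by the API by the per-million-token prices above. A small sketch follows (prices hard-coded from the table; check the live pricing page, since rates can change):

# Cost estimate at the CometAPI rates from the table above (USD per 1M tokens).
INPUT_PRICE_PER_M = 0.44
OUTPUT_PRICE_PER_M = 1.78

def estimate_cost(prompt_tokens: int, completion_tokens: int) -> float:
    """Return the approximate USD cost of one glm-4.7 call."""
    return (prompt_tokens * INPUT_PRICE_PER_M
            + completion_tokens * OUTPUT_PRICE_PER_M) / 1_000_000

# e.g. a call with 12,000 input tokens and 2,000 output tokens:
print(f"${estimate_cost(12_000, 2_000):.4f}")  # ≈ $0.0088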
from openai import OpenAI
import os

# Get your CometAPI key from https://api.cometapi.com/console/token
COMETAPI_KEY = os.environ.get("COMETAPI_KEY") or "<YOUR_COMETAPI_KEY>"
BASE_URL = "https://api.cometapi.com/v1"

client = OpenAI(base_url=BASE_URL, api_key=COMETAPI_KEY)

# glm-4.7: Zhipu GLM-4.7 model via chat/completions
completion = client.chat.completions.create(
    model="glm-4.7",
    messages=[
        {"role": "user", "content": "Hello! Tell me a short joke."}
    ],
)

print(completion.choices[0].message.content)
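For longer generations you may want to stream tokens as they arrive instead of waiting for the full completion. Here is a sketch using the same OpenAI SDK, assuming the CometAPI gateway passes the standard stream flag through to the model:

# Streaming variant (assumes the gateway supports the standard stream=True flag).
from openai import OpenAI
import os

client = OpenAI(
    base_url="https://api.cometapi.com/v1",
    api_key=os.environ.get("COMETAPI_KEY", "<YOUR_COMETAPI_KEY>"),
)

stream = client.chat.completions.create(
    model="glm-4.7",
    messages=[{"role": "user", "content": "Write a haiku about long context windows."}],
    stream=True,
)
for chunk in stream:
    # Some chunks carry no content delta (e.g. a final usage chunk), so guard for it.
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)
print()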