Mar 16, 2026 · 2 min read

Run GLM Models Through Claude Code with a Single Command

A one-line shell function to run Zhipu AI's GLM models through Claude Code's CLI — keep the TUI you love, swap the model underneath.

If you've been wanting to try out Zhipu AI's GLM models but don't want to leave the comfort of Claude Code's interface, here's a dead-simple setup that gets you there in one line.

The One-Liner

Paste this into your terminal and you're done:

echo 'glm(){ ANTHROPIC_BASE_URL="https://api.z.ai/api/anthropic" ANTHROPIC_AUTH_TOKEN="YOUR_API_KEY" ANTHROPIC_DEFAULT_SONNET_MODEL="glm-4.7" ANTHROPIC_DEFAULT_OPUS_MODEL="glm-4.7" ANTHROPIC_DEFAULT_HAIKU_MODEL="glm-4.5-air" API_TIMEOUT_MS="3000000" claude "$@"; }' >> ~/.zshrc && source ~/.zshrc

Replace YOUR_API_KEY with your actual Zhipu AI API key.

What This Does

It appends a shell function called glm to your ~/.zshrc that wraps the claude CLI with a few overridden environment variables:

  • ANTHROPIC_BASE_URL — Points requests to Zhipu AI's Anthropic-compatible endpoint instead of Anthropic's servers.
  • ANTHROPIC_AUTH_TOKEN — Your Zhipu API key for authentication.
  • ANTHROPIC_DEFAULT_SONNET_MODEL / ANTHROPIC_DEFAULT_OPUS_MODEL — Both mapped to glm-4.7, Zhipu's flagship model.
  • ANTHROPIC_DEFAULT_HAIKU_MODEL — Mapped to glm-4.5-air, the lighter/faster option.
  • API_TIMEOUT_MS — Set to 3,000,000ms (50 minutes) because GLM responses can take longer than Anthropic's defaults expect.
  • "$@" — Passes through any flags you give to the glm command directly to claude.
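If you'd rather read (or edit) the function than squint at the one-liner, here is the same definition spread over multiple lines — identical variables and values, just formatted for pasting into ~/.zshrc by hand:

```shell
# Same function as the one-liner, in multi-line form.
# The variable assignments prefixing the command apply only to
# this invocation of claude, not to your shell session.
glm() {
  ANTHROPIC_BASE_URL="https://api.z.ai/api/anthropic" \
  ANTHROPIC_AUTH_TOKEN="YOUR_API_KEY" \
  ANTHROPIC_DEFAULT_SONNET_MODEL="glm-4.7" \
  ANTHROPIC_DEFAULT_OPUS_MODEL="glm-4.7" \
  ANTHROPIC_DEFAULT_HAIKU_MODEL="glm-4.5-air" \
  API_TIMEOUT_MS="3000000" \
  claude "$@"
}
```

Because the overrides are command-prefix assignments rather than exports, they never leak into the rest of your shell environment.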

Usage

Once sourced, just use glm anywhere you'd use claude:

# Start an interactive session
glm

# Run with a direct prompt
glm -p "explain this codebase"

# Everything else works the same
glm --resume

Your original claude command stays completely untouched — it still hits Anthropic's API as usual. The glm function is a separate entry point.

Why This Is Nice

Claude Code's TUI is genuinely great — the conversation flow, /compact, session resumption, the tool use loop. This lets you keep all of that while swapping the brain underneath. Useful for comparing model outputs on the same codebase or just experimenting with GLM without context-switching to a different tool.

Notes

  • You'll need a Zhipu AI API key from z.ai.
  • If you're on bash instead of zsh, replace ~/.zshrc with ~/.bashrc.
  • The long timeout is intentional — GLM can be slower on complex agentic loops. Adjust if needed.
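One small caveat with the `>> ~/.zshrc` approach: rerunning the one-liner appends a second copy of the function. A guarded variant (the grep guard is my addition, not part of the original one-liner, and assumes you haven't renamed the function) makes the append idempotent:

```shell
# Guarded append: skip if a glm() definition already exists in the rc file.
RC_FILE="$HOME/.zshrc"   # swap in "$HOME/.bashrc" if you use bash
if ! grep -q 'glm()' "$RC_FILE" 2>/dev/null; then
  echo 'glm(){ ANTHROPIC_BASE_URL="https://api.z.ai/api/anthropic" ANTHROPIC_AUTH_TOKEN="YOUR_API_KEY" ANTHROPIC_DEFAULT_SONNET_MODEL="glm-4.7" ANTHROPIC_DEFAULT_OPUS_MODEL="glm-4.7" ANTHROPIC_DEFAULT_HAIKU_MODEL="glm-4.5-air" API_TIMEOUT_MS="3000000" claude "$@"; }' >> "$RC_FILE"
fi
```

Run it as many times as you like; the function lands in the rc file exactly once.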

That's it. One line, no config files, no wrappers. Just a shell function doing shell function things.

#claude-code #glm #zhipu-ai #shell #developer-tools #llm #cli #zsh