AI Features & LLM Integration
Zammad 7 introduces a full suite of AI capabilities — automatic ticket summaries, AI-assisted replies, auto-generated ticket titles, and more. See our blog post and the official release notes for details.
To use AI features, connect Zammad to a Large Language Model (LLM). Several options are available: Zammad AI (operated by the Zammad team), major commercial providers — OpenAI (ChatGPT), Anthropic (Claude), Azure AI, and Mistral (EU-based) — or any OpenAI API-compatible provider, including Ollama for fully self-hosted models.
All you typically need is an API token and, in some cases, an API URL. For ChatGPT, Gemini, and similar services, refer to each provider’s documentation to generate your token.
Connecting a self-hosted LLM is straightforward. The only requirement is that its API must be reachable over the public internet (protected by authentication). In Zammad, add your LLM under Settings → AI → Provider: enter the API URL, token, and model name. AI features are available immediately; no further configuration is needed.
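Before entering the URL and token in Zammad, it can help to check that the endpoint answers at all. The sketch below assembles a request against an OpenAI-compatible `/v1/chat/completions` endpoint (the API style the providers above, including Ollama, expose). `BASE_URL`, `TOKEN`, and `MODEL` are placeholders, not real values; substitute your own before sending.

```python
# Sanity-check sketch for an OpenAI-compatible LLM endpoint.
# BASE_URL, TOKEN, and MODEL below are placeholder values (assumptions),
# not credentials or identifiers from your installation.
import json
from urllib import request

BASE_URL = "https://llm.example.com/v1"  # the API URL you will enter in Zammad
TOKEN = "sk-placeholder"                 # your API token
MODEL = "my-model"                       # the model name, as your provider lists it

# Minimal chat-completion payload in the OpenAI-compatible format.
payload = {
    "model": MODEL,
    "messages": [{"role": "user", "content": "Reply with 'ok'."}],
}

req = request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "Content-Type": "application/json",
    },
)

# Uncomment once BASE_URL and TOKEN are real to actually send the request:
# with request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])

print(req.full_url)  # sanity-check the assembled endpoint URL
```

If this request succeeds from outside your network, the endpoint meets the "publicly accessible, protected by authentication" requirement and the same URL, token, and model name can be entered in Zammad.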

Note: Running an LLM is resource-intensive. GPUs with at least 24 GB VRAM are typically required. Smaller models need fewer resources but generally deliver lower-quality results.
Questions about running or connecting your own LLM? Reach out at support@server.camp.
Rather than running your own LLM or subscribing to OpenAI, Anthropic, or similar services, you can add a pre-configured LLM directly during checkout — it’s set up automatically with no extra steps.

Three open-source models are available:
- GPT-OSS 120b (by OpenAI) — our recommendation, best results in practice
- Mistral Small 3.2 24b (by Mistral) — the European alternative
- Llama 3.3 70b (by Meta) — another highly capable model
We’re actively evaluating all three and welcome your feedback.
All models run at Scaleway in Paris, France — GDPR-compliant hosting in the EU.
Beta phase: server.camp LLM integration has been in free beta since March 10, 2026, and is available on the Business and Corporate plans. After the evaluation phase, it will become a paid add-on. You'll receive advance notice before any changes take effect.