# Offline (Local LLM)
The ClearSpecs AI - Local LLM SKU sends prompts to your own OpenAI-compatible HTTP API instead of to the hosted ClearSpecs cloud assistant. It suits air-gapped environments, bring-your-own-model policies, and teams that already run Ollama or a similar server.
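To illustrate what "OpenAI-compatible" means here, the sketch below sends a chat-completions request to a local Ollama server using the standard OpenAI wire format. The base URL (`http://localhost:11434/v1` is Ollama's default OpenAI-compatible root), the model tag, and the placeholder API key are assumptions for a default local install; substitute whatever your own server exposes.

```typescript
// Minimal sketch of an OpenAI-compatible chat-completions call against a
// local Ollama server. URL, model tag, and key are assumptions for a
// default install; any OpenAI-compatible server accepts the same shape.
const baseUrl = "http://localhost:11434/v1"; // Ollama's OpenAI-compatible API root
const apiKey = "ollama"; // Ollama ignores the key, but the wire format expects the header

async function chat(prompt: string): Promise<string> {
  const res = await fetch(`${baseUrl}/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({
      model: "llama3.1", // assumed model tag; use one you have pulled locally
      messages: [{ role: "user", content: prompt }],
    }),
  });
  if (!res.ok) throw new Error(`LLM server returned ${res.status}`);
  const data = await res.json();
  return data.choices[0].message.content;
}

chat("Summarize this requirement in one sentence.").then(console.log);
```

Note that because the extension calls the server from the browser, the server must also allow cross-origin requests from your Azure DevOps host (for Ollama, via its `OLLAMA_ORIGINS` environment variable); the configuration topic below covers this.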
**Distribution:** The offline (Local LLM) package is distributed privately; it is not published on the public Visual Studio Marketplace. If you are interested in installing the offline extension, contact us at [email protected].
## Next steps
| Topic | Description |
|---|---|
| Configure an OpenAI-compatible endpoint | Base URL, model, API key, and CORS for Ollama or another OpenAI-compatible server. |
| How to enable in XML process templates | WebLayout XML, witadmin import, and extension contributions on Azure DevOps Server. |