Configure an OpenAI-compatible endpoint
These settings apply after the privately distributed Local LLM extension is installed in your organization (see Introduction).
Settings in Azure DevOps
After the Local LLM extension is installed and assigned to your project:
- Open Organization settings (Azure DevOps Services) or Collection settings (Azure DevOps Server).
- Find ClearSpecs AI and open the Local LLM tab.
- Set:
  - Base URL - HTTP(S) endpoint implementing an OpenAI-compatible chat/completions API (trailing slashes are normalized).
  - Model - Model name your server exposes (for example an Ollama model tag).
  - API key - Optional; include it if your proxy requires one.
The common default for Ollama's OpenAI-compatible API is http://127.0.0.1:11434/v1.
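A quick way to sanity-check the values above is from a terminal, outside the browser. This sketch assumes the default Ollama URL and a hypothetical model tag (`llama3.2`); substitute your own values. The URL normalization shown mirrors the behavior described above (a trailing slash on the Base URL is harmless):

```shell
# Assumed values: default Ollama endpoint and a hypothetical model tag.
BASE_URL="http://127.0.0.1:11434/v1/"   # trailing slash is fine; stripped below
MODEL="llama3.2"

# Compose the chat/completions URL the extension will call.
CHAT_URL="${BASE_URL%/}/chat/completions"
echo "$CHAT_URL"

# Manual smoke test (requires a running server with the model pulled):
#   curl -s "$CHAT_URL" \
#     -H "Content-Type: application/json" \
#     -d "{\"model\":\"$MODEL\",\"messages\":[{\"role\":\"user\",\"content\":\"ping\"}]}"
```

If the curl call returns a JSON completion, the endpoint and model name are correct; an HTTP 404 on the model usually means the tag is not pulled on the server.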
Settings are stored in the extension's Azure DevOps Extension Data storage; access follows Microsoft's scoping rules for extension data in your organization.
CORS (browser and Ollama)
The Boards UI runs on an Azure DevOps origin (for example https://dev.azure.com). Calls to a local Ollama instance are cross-origin. The LLM server must return CORS headers that allow your Azure DevOps web origin.
For Ollama, set OLLAMA_ORIGINS (or your version’s documented CORS settings) to include your Azure DevOps origin (scheme + host). Without that, browser fetch requests fail even when Ollama is running locally.
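The setup above can be sketched as follows. This assumes your organization is on the https://dev.azure.com origin and that your Ollama version honors OLLAMA_ORIGINS; check your version's documentation for the exact variable and restart behavior:

```shell
# Allow the Azure DevOps web origin (assumption: your org uses dev.azure.com).
# Scheme + host only -- no path, no trailing slash.
export OLLAMA_ORIGINS="https://dev.azure.com"
echo "$OLLAMA_ORIGINS"

# Restart the server so the variable takes effect:
#   ollama serve

# Preflight check from outside the browser (requires a running server):
#   curl -s -i -X OPTIONS http://127.0.0.1:11434/v1/chat/completions \
#     -H "Origin: https://dev.azure.com" \
#     -H "Access-Control-Request-Method: POST"
```

A working setup returns an Access-Control-Allow-Origin header matching the origin you sent; if that header is missing, the browser will block the request even though curl to the same URL succeeds.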
Privacy
This SKU does not call the ClearSpecs cloud assistant API for prompt execution; requests go only to the URL you configure (subject to your network and proxy rules).