Class OllamaProvider

java.lang.Object
io.forgeai.jenkins.llm.LLMProvider
io.forgeai.jenkins.llm.OllamaProvider
All Implemented Interfaces:
ExtensionPoint, Describable<LLMProvider>, Serializable

public class OllamaProvider extends LLMProvider
LLM provider that runs inference against a local Ollama daemon, making it suitable for air-gapped environments where requests must not leave the network. Recommended models: codellama, deepseek-coder, mistral, llama3, phi3.
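The class's own API is not shown on this page, but the transport it wraps is Ollama's standard HTTP interface: a POST to the daemon's /api/generate endpoint (default port 11434) with a JSON body naming the model and prompt. The sketch below illustrates that request shape directly with java.net.http; the OllamaExample class and buildRequestBody helper are illustrative names, not part of this plugin.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class OllamaExample {
    // Hypothetical helper: builds the JSON body for Ollama's /api/generate
    // endpoint. "stream":false asks for a single complete response.
    static String buildRequestBody(String model, String prompt) {
        return String.format(
            "{\"model\":\"%s\",\"prompt\":\"%s\",\"stream\":false}",
            model, prompt.replace("\"", "\\\""));
    }

    public static void main(String[] args) throws Exception {
        String body = buildRequestBody("codellama", "Explain this Jenkinsfile stage.");

        // Target a local Ollama daemon on its default port; sending the
        // request requires Ollama to be running, so it is left commented out.
        HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create("http://localhost:11434/api/generate"))
            .header("Content-Type", "application/json")
            .POST(HttpRequest.BodyPublishers.ofString(body))
            .build();
        // HttpResponse<String> response = HttpClient.newHttpClient()
        //         .send(request, HttpResponse.BodyHandlers.ofString());

        System.out.println(body);
    }
}
```

Because nothing ever leaves localhost, this style of call is what makes the provider viable in air-gapped installations.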
See Also: