Package io.forgeai.jenkins.llm
Class Summary:
- Extension point for LLM backends.
- Deprecated. Providers are now configured directly in ForgeAIGlobalConfiguration.
- Ollama provider for air-gapped / local LLM inference.
- OpenAI-compatible provider.
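To illustrate the shape such a package might take, here is a minimal sketch of an extension point with an Ollama-style and an OpenAI-compatible provider. All names here (`LlmProvider`, `OllamaProvider`, `OpenAiCompatibleProvider`, `complete`) are assumptions for illustration, not the plugin's actual API, and Jenkins-specific machinery (`ExtensionPoint`, descriptors) is omitted so the sketch stays self-contained:

```java
// Hypothetical sketch only: names below are assumptions, not the real
// io.forgeai.jenkins.llm API. Jenkins types (ExtensionPoint, Descriptor)
// are deliberately left out.
abstract class LlmProvider {
    /** Sends a prompt to the backing model and returns its completion. */
    abstract String complete(String prompt);
}

// Local inference backend, e.g. for air-gapped installations.
class OllamaProvider extends LlmProvider {
    private final String endpoint;
    OllamaProvider(String endpoint) { this.endpoint = endpoint; }
    @Override
    String complete(String prompt) {
        // A real implementation would POST to the Ollama HTTP API at `endpoint`.
        return "[ollama@" + endpoint + "] " + prompt;
    }
}

// Backend for any service exposing an OpenAI-compatible API.
class OpenAiCompatibleProvider extends LlmProvider {
    private final String baseUrl;
    OpenAiCompatibleProvider(String baseUrl) { this.baseUrl = baseUrl; }
    @Override
    String complete(String prompt) {
        // A real implementation would call an OpenAI-style chat endpoint.
        return "[openai@" + baseUrl + "] " + prompt;
    }
}

public class ProviderDemo {
    public static void main(String[] args) {
        // Callers depend only on the extension point, not a concrete backend.
        LlmProvider p = new OllamaProvider("http://localhost:11434");
        System.out.println(p.complete("hello"));
    }
}
```

In a real Jenkins plugin, the concrete providers would be discovered through the extension mechanism rather than instantiated directly; the deprecated entry in the summary suggests that configuration has since moved into ForgeAIGlobalConfiguration.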