Using a custom LLM provider
phospho preview can be run with any OpenAI-compatible LLM provider. The most common ones include:
- Mistral AI (https://mistral.ai/)
- Ollama (https://ollama.com/)
- vLLM (https://docs.vllm.ai/)
- and many others
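
As an illustration, here is a minimal sketch of what calling an OpenAI-compatible endpoint looks like, using the official openai Python client pointed at a local Ollama server. The base URL, API key value, and model name are assumptions for a default local Ollama install; substitute the values for your own provider.

```python
from openai import OpenAI

# Assumed setup: a local Ollama instance exposing its OpenAI-compatible API.
client = OpenAI(
    base_url="http://localhost:11434/v1",  # assumed default Ollama endpoint
    api_key="ollama",  # Ollama ignores the key, but the client requires a value
)

# Send a chat completion request to the custom provider.
response = client.chat.completions.create(
    model="llama3",  # assumed model name; use any model you have pulled
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```

Any provider from the list above works the same way: it only needs to expose the OpenAI chat completions API at a base URL you can point to.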