morphik.toml
Morphik uses a single morphik.toml configuration file to control all aspects of the system.
Model Configuration
Morphik uses LiteLLM to route to 100+ LLM providers with a unified interface. This means you can use models from OpenAI, Anthropic, Google, AWS Bedrock, Azure, Hugging Face, and many more, all with the same simple configuration format.
Example Configurations
In your morphik.toml, define models in the LiteLLM format:
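As a sketch, a model registration section might look like the following (the section name, model keys, and model identifiers below are illustrative; check the LiteLLM provider list for the exact model strings your provider expects):

```toml
# Illustrative sketch - keys and model names are examples, not a definitive schema.
[registered_models]
openai_gpt4o = { model_name = "gpt-4o" }
claude_sonnet = { model_name = "claude-3-5-sonnet-latest" }
gemini_flash = { model_name = "gemini/gemini-1.5-flash" }
```

Because LiteLLM normalizes provider APIs, switching providers is typically just a matter of changing the model_name string.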
Local LLMs
Morphik can also run entirely with local LLMs. We integrate directly with two major local LLM servers:
Ollama
Ollama - Popular local LLM server for running open-weight models on your own hardware.
🍋 Lemonade
Lemonade Server - Optimized local LLM server for AMD GPUs and NPUs.
Docker Deployments
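As an example, registering an Ollama-served model could look like this (the model name and port are assumptions based on Ollama's defaults and LiteLLM's ollama_chat prefix; adjust to your setup):

```toml
# Illustrative sketch: a local model served by Ollama on its default port,
# addressed through LiteLLM's "ollama_chat/<model>" naming convention.
[registered_models]
local_llama = { model_name = "ollama_chat/llama3.2", api_base = "http://localhost:11434" }
```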
When running Morphik in Docker:
- Local services: use http://host.docker.internal:PORT
- Both in Docker: use container names (e.g., http://ollama:11434)
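Concretely, only the api_base changes between the two setups. A hypothetical sketch (model name and hostnames are illustrative):

```toml
# Morphik in Docker, Ollama running on the host machine:
local_llama = { model_name = "ollama_chat/llama3.2", api_base = "http://host.docker.internal:11434" }

# Morphik and Ollama both running as containers on the same Docker network,
# where the container name resolves as a hostname:
local_llama = { model_name = "ollama_chat/llama3.2", api_base = "http://ollama:11434" }
```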
Need Help?
- Join our Discord community
- Check GitHub for issues