- 🔬 Built for fast and production usage
- 🚂 Supports llama3, qwen2, gemma, etc., and many quantized versions (see the full list)
- ⛓️ OpenAI-compatible API
- 💬 Built-in ChatGPT-like UI
- 🔥 Accelerated LLM decoding with state-of-the-art inference backends
- 🖥️ Ready for enterprise-grade cloud deployment (Kubernetes, Docker and BentoCloud)
Installation and Setup
Install the OpenLLM package via PyPI:
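```bash
pip install openllm
```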
LLM

OpenLLM supports a wide range of open-source LLMs as well as serving users' own fine-tuned LLMs. Use the `openllm model` command to see all available models that are pre-optimized for OpenLLM.
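For example, once the package is installed, the command referenced above can be run directly from a shell to inspect the supported models:

```bash
# List the open-source models that are pre-optimized for OpenLLM
openllm model
```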