Getting Started
Quick Start
Quick start guides for different Soom Chat deployment options
Choose your preferred deployment method to get started with Soom Chat quickly.
Deployment Options
Local AI Models
- Starting with Ollama - Use local AI models with Ollama
- Starting with Llama.cpp - Direct Llama.cpp integration
Cloud AI Services
- Starting with OpenAI - OpenAI API integration
- Starting with OpenAI-Compatible Servers - Compatible server setup
Advanced Features
- Getting Started with Functions - Function integration
Prerequisites
Before you begin, ensure you have:
- Docker installed on your system
- Basic understanding of environment variables
- Access to your preferred AI model provider (if using cloud services)
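The prerequisites above can be checked from a terminal. The snippet below is a minimal sketch: the `docker --version` check is standard, but the image name `soomchat/soom-chat` and the `3000:8080` port mapping are assumptions for illustration, not values from this guide — substitute the actual image and ports from the deployment guide you follow.

```shell
# Confirm Docker is installed and the daemon is reachable
docker --version

# If you use a cloud provider, export its API key as an
# environment variable (only needed for cloud services)
export OPENAI_API_KEY="your-key-here"

# Hypothetical container launch -- the image name and ports
# below are placeholders, not official Soom Chat values
docker run -d \
  -p 3000:8080 \
  -e OPENAI_API_KEY="$OPENAI_API_KEY" \
  --name soom-chat \
  soomchat/soom-chat:latest
```

Passing configuration through `-e` flags is how Docker injects environment variables into a container, which is why a basic understanding of them is listed as a prerequisite.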
Getting Help
If you encounter issues during setup, see the Troubleshooting section or the FAQ.