Local Models
Run models locally with Ollama, LM Studio, and more.
Overview
This section covers running local models in Overseer, including providers such as Ollama and LM Studio.
Note
This documentation page is being expanded. Check the sidebar for related topics.
Getting Started
To get started with local models, make sure you have completed the Quickstart guide.
Configuration
Configuration options will be documented here. For now, refer to the application settings.
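Until that reference is written, the sketch below illustrates the general shape a local-model entry might take, following the skeleton from the Examples section. The provider, endpoint, and model keys (and the filename) are assumptions for illustration, not confirmed Overseer schema; only the endpoint defaults are fixed by the tools themselves (Ollama serves on port 11434 by default, LM Studio's local server on port 1234). Check the application settings for the exact keys Overseer expects.
local-ollama.yaml
# Hypothetical local-model entry; key names below "settings" are
# illustrative and may not match Overseer's actual schema.
name: local-llama
type: models
settings:
  enabled: true
  provider: ollama                  # assumed key; LM Studio would be analogous
  endpoint: http://localhost:11434  # Ollama's default port; LM Studio defaults to 1234
  model: llama3                     # a model you have already pulled locally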
Examples
example.yaml
# Example configuration
name: example
type: models
settings:
  enabled: true
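Here, name labels the entry, type: models registers it as a model configuration, and settings.enabled turns it on. Treat this as a minimal skeleton rather than a complete reference; the full set of settings keys is not yet documented.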
Next Steps
Explore more documentation in the sidebar, or head back to the documentation home.