Installation¶
Requirements¶
- Python 3.11 or newer
- API access to at least one provider, or a local OpenAI-compatible endpoint
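You can confirm the interpreter meets the version requirement before installing. This check is a convenience, not part of the project's tooling:

```shell
# Exits non-zero if the active Python is older than 3.11.
python3 -c 'import sys; sys.exit(0 if sys.version_info >= (3, 11) else 1)'
```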
Install from source¶
git clone https://github.com/edo0xff/llm-expose.git
cd llm-expose
pip install -e .
Install with development tools¶
pip install -e .[dev]
In zsh, quote extras to avoid shell globbing:
pip install -e '.[dev]'
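If you prefer an isolated environment, a standard virtual environment works with the editable install above. This is general Python practice, not a requirement stated by these docs:

```shell
# Create and activate a virtual environment, then install in editable mode.
python3 -m venv .venv
source .venv/bin/activate
pip install -e '.[dev]'
```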
Configure environment variables¶
Set provider keys as environment variables when possible.
export OPENAI_API_KEY="sk-..."
export ANTHROPIC_API_KEY="sk-ant-..."
Optionally override where llm-expose stores config:
export LLM_EXPOSE_CONFIG_DIR="$HOME/.llm-expose"
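To make these variables persist across sessions, you can append the exports to your shell profile; the profile path below assumes bash (zsh users would use `~/.zshrc`):

```shell
# Append the exports to the shell profile so new sessions pick them up.
cat >> ~/.bashrc <<'EOF'
export OPENAI_API_KEY="sk-..."
export LLM_EXPOSE_CONFIG_DIR="$HOME/.llm-expose"
EOF
```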
Default layout under the config directory:
- models/ for saved model definitions
- channels/ for saved channel definitions
- pairs.yaml for pairing allow-lists
- mcp_servers.yaml for MCP server and settings config
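Put together, the config directory resembles the tree below (file names come from the list above; everything inside the subdirectories is populated by your own saved definitions):

```
$HOME/.llm-expose/
├── models/           # saved model definitions
├── channels/         # saved channel definitions
├── pairs.yaml        # pairing allow-lists
└── mcp_servers.yaml  # MCP server and settings config
```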
Keep bot tokens and API keys out of version control and prefer environment-variable injection in deployment.
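One common pattern for environment-variable injection is an env file excluded from version control; the Docker usage below is an illustrative assumption, not something these docs prescribe:

```shell
# Keep secrets in an env file that git ignores, then inject at run time.
cat > .env <<'EOF'
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
EOF
echo ".env" >> .gitignore

# Hypothetical deployment: pass the file to the container instead of baking
# keys into the image ("my-llm-expose-image" is a placeholder name).
# docker run --env-file .env my-llm-expose-image
```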
Validate CLI¶
llm-expose --help
Next: Quick Start