LiteLLM & Cline (using Codestral)
Using LiteLLM with Cline
This guide demonstrates how to run LiteLLM locally, starting with the Codestral model, for use with Cline.
Prerequisites
Docker CLI or Docker Desktop installed to run the LiteLLM image locally
For this example config: a Codestral API key (this is different from a Mistral API key)
Setup
Create a .env file and fill in the appropriate fields:
# Tip: Use the following command to generate a random alphanumeric key:
# openssl rand -base64 32 | tr -dc 'A-Za-z0-9' | head -c 32
LITELLM_MASTER_KEY=YOUR_LITELLM_MASTER_KEY
CODESTRAL_API_KEY=YOUR_CODESTRAL_API_KEY
Note: Although the proxy is only exposed on localhost, it's good practice to set LITELLM_MASTER_KEY to something secure.
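As a concrete example, the .env file can be created from the shell using the same openssl command shown in the tip above (illustrative only; substitute your real Codestral key):

# Generate a random master key and write both entries to .env
echo "LITELLM_MASTER_KEY=$(openssl rand -base64 32 | tr -dc 'A-Za-z0-9' | head -c 32)" > .env
echo "CODESTRAL_API_KEY=YOUR_CODESTRAL_API_KEY" >> .env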
Configuration
We'll need to create a config.yaml file to contain our LiteLLM configuration. In this case we'll have just one model, 'codestral-latest', and label it 'codestral':
model_list:
  - model_name: codestral
    litellm_params:
      model: codestral/codestral-latest
      api_key: os.environ/CODESTRAL_API_KEY
Running the Demo
Start up the LiteLLM Docker container:
docker run \
--env-file .env \
-v $(pwd)/config.yaml:/app/config.yaml \
-p 127.0.0.1:4000:4000 \
ghcr.io/berriai/litellm:main-latest \
--config /app/config.yaml --detailed_debug
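Once the container reports that the proxy is listening, you can do a quick sanity check from another terminal. This sketch assumes LiteLLM's standard OpenAI-compatible /v1/models route; replace YOUR_LITELLM_MASTER_KEY with the value from your .env file:

# List the models the proxy is serving; 'codestral' should appear in the response
curl http://127.0.0.1:4000/v1/models \
  -H "Authorization: Bearer YOUR_LITELLM_MASTER_KEY"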
Set up Cline
Once the LiteLLM server is up and running, you can configure it as a provider in Cline:
Base URL should be http://127.0.0.1:4000/v1
API Key should be the one you set in .env for LITELLM_MASTER_KEY
Model ID is codestral, or whatever you named it under config.yaml
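Before pointing Cline at the proxy, you can also confirm that requests are actually routed to Codestral by sending a test request directly. This is a sketch using the standard OpenAI-compatible chat completions endpoint; the prompt and key placeholder are illustrative:

# Send a test chat completion through the proxy to the 'codestral' model
curl http://127.0.0.1:4000/v1/chat/completions \
  -H "Authorization: Bearer YOUR_LITELLM_MASTER_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "codestral",
    "messages": [{"role": "user", "content": "Write a hello world program in Python."}]
  }'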
Getting Help
Author: mdp