LM Studio

A quick guide to setting up LM Studio for local AI model execution with Cline.

🤖 Setting Up LM Studio with Cline

Run AI models locally using LM Studio with Cline.

📋 Prerequisites

  • Windows, macOS, or Linux computer with AVX2 support

  • Cline installed in VS Code

🚀 Setup Steps

1. Install LM Studio

  • Download the installer for your operating system from lmstudio.ai and install it

2. Launch LM Studio

  • Open the installed application

  • You'll see four tabs on the left: Chat, Developer (where you start the server), My Models (where your downloaded models are stored), and Discover (where you add new models)

3. Download a Model

  • Browse the "Discover" page

  • Select and download your preferred model

  • Wait for download to complete

4. Start the Server

  • Navigate to the "Developer" tab

  • Toggle the server switch to "Running"

  • Note: The server will run at http://localhost:1234
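Before pointing Cline at the server, you can confirm it is reachable. This is a minimal sketch assuming the default port 1234; LM Studio's local server exposes an OpenAI-compatible API, so `/v1/models` lists the currently available models.

```shell
# Check that the LM Studio server answers on its default port (1234).
# Prints the model list as JSON if the server is up; otherwise prints a message.
curl -s http://localhost:1234/v1/models || echo "LM Studio server is not running"
```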

5. Configure Cline

  1. Open VS Code

  2. Click the Cline settings icon

  3. Select "LM Studio" as API provider

  4. Select your model from the available options

⚠️ Important Notes

  • Start LM Studio before using with Cline

  • Keep LM Studio running in background

  • First model download may take several minutes depending on size

  • Models are stored locally after download

🔧 Troubleshooting

  If Cline can't connect to LM Studio:

  1. Verify the LM Studio server is running (check the Developer tab)

  2. Ensure a model is loaded

  3. Check that your system meets the hardware requirements
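If the steps above don't resolve it, you can test the server independently of Cline. The sketch below assumes the default port 1234 and LM Studio's OpenAI-compatible `/v1/chat/completions` endpoint; the `ask_local_model` helper name is illustrative, not part of any API.

```python
# Minimal sketch: query the local LM Studio server directly, bypassing Cline.
# Assumes the default port 1234 and that a model is loaded in LM Studio.
import json
import urllib.error
import urllib.request

def ask_local_model(prompt, base_url="http://localhost:1234/v1"):
    """Send one chat message to the local server; return the reply or None."""
    payload = {
        # LM Studio serves whichever model is currently loaded, so the
        # model name here is largely a placeholder.
        "model": "local-model",
        "messages": [{"role": "user", "content": prompt}],
    }
    req = urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return json.load(resp)["choices"][0]["message"]["content"]
    except (urllib.error.URLError, OSError):
        return None  # server not running, unreachable, or no model loaded
```

If this returns None while LM Studio appears to be running, double-check the Developer tab's server toggle and port.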
