Thursday, September 11, 2025

VS Code + LM Studio via Continue

LM Studio

Download LM Studio: https://lmstudio.ai/


Search for a Model:

Download a Model (let's use OpenAI's open-source gpt-oss model):

Enable the local API server and host the model:

Configure the model (optionally adjust the context length, GPU offloading, flash attention, etc.) and load it:


Once it's finished loading the model successfully, you should see something like this (copy down the loaded model name; we'll need it later):


We're all done with the LM Studio side of things.
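
If you want, you can sanity-check the local server before switching over to VS Code. The sketch below assumes LM Studio's OpenAI-compatible server is listening on its default port 1234 on the same machine (adjust the host and port to match your setup); it lists the loaded models, which is also a handy way to grab the exact model name:

import json
import urllib.request

# LM Studio's local server speaks the OpenAI API; /v1/models lists whatever is loaded.
# Assumes the default port 1234 on the local machine -- change this to match your setup.
API_BASE = "http://localhost:1234/v1"

with urllib.request.urlopen(f"{API_BASE}/models", timeout=10) as resp:
    data = json.load(resp)

# Each "id" is the model name you'll reference in Continue's config.yaml.
for model in data.get("data", []):
    print(model["id"])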

VS Code

Download and install VS Code: https://code.visualstudio.com/


Go to Extensions and install the "Continue" extension:


Click on the "Continue" button at the bottom right of the window and then click "Open Settings":


Click on "Agents" then "Local Agent", a "config.yaml" will open in the editor:



Add an entry for the OpenAI model we loaded in LM Studio (I'm loading the larger 120b version here), along with any roles you want it to have:


(snippet):
  - name: OpenAI (LM Studio)
    provider: lmstudio
    # Must match the loaded model name you copied from LM Studio
    model: openai/gpt-oss-120b
    # Point this at the machine hosting LM Studio's local server (default port 1234)
    apiBase: http://192.168.1.254:1234/v1
    roles:
      - chat
      - edit
      - apply
      - autocomplete
      - embed

Save the "config.yaml" file (ctrl+s)

Go to "Models" and select OpenAi as the option for which features you want to enable with it (and additional options if you want):


Now that you've configured everything, let's give it a try. Click on the bottom-right "Continue" button again and click "Open Chat". A new window should come up for you to prompt in. Make sure the selected model is our LM Studio-hosted model:



Let's see if it can generate a Python function for calculating the area of a triangle:


And it should work like so:
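
The exact code the model produces will vary from run to run, but you should end up with something along these lines:

def triangle_area(base: float, height: float) -> float:
    """Return the area of a triangle given its base and height."""
    return 0.5 * base * height

# Example usage
print(triangle_area(10, 4))  # 20.0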