LM Studio
Download LM Studio: https://lmstudio.ai/
Configure the model (optionally increasing the context length, GPU offloading, flash attention, etc.) and load it:
Once it's done loading the model successfully you should see something like this (copy down the loaded model name; we'll need it later):
We're all done with the LM Studio side of things.
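Before moving on to VS Code, you can optionally confirm the server is responding. The sketch below hits LM Studio's OpenAI-compatible chat completions endpoint in Python; it assumes the server is reachable at the same address and port we'll use in the Continue config further down (http://192.168.1.254:1234) and that the loaded model name is openai/gpt-oss-120b, so adjust both to match what LM Studio actually shows you:

    import requests

    # Assumed values: use the address/port your LM Studio server is listening on
    # and the exact model name shown after loading.
    API_BASE = "http://192.168.1.254:1234/v1"
    MODEL = "openai/gpt-oss-120b"

    resp = requests.post(
        f"{API_BASE}/chat/completions",
        json={
            "model": MODEL,
            "messages": [{"role": "user", "content": "Reply with a single word: ready"}],
            "max_tokens": 20,
        },
        timeout=120,
    )
    resp.raise_for_status()
    print(resp.json()["choices"][0]["message"]["content"])

If that prints a reply, the server and model are good to go.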
VS Code
Go to Extensions and install the "Continue" extension:
Add an entry for the OpenAI model we loaded in LM Studio (I'm loading the bigger 120B version here), along with any roles you want it to have:
(snippet):
  - name: OpenAI (LM Studio)
    provider: lmstudio
    model: openai/gpt-oss-120b
    apiBase: http://192.168.1.254:1234/v1
    roles:
      - chat
      - edit
      - apply
      - autocomplete
      - embed
Save the "config.yaml" file (Ctrl+S).
Go to "Models" and select OpenAi as the option for which features you want to enable with it (and additional options if you want):
Now that you've configured everything, let's give it a try. Click the "Continue" button in the bottom right again and click "Open Chat". A new window should come up for you to prompt in. Make sure the model selected is our LM Studio-hosted model:
Let's see if it can generate a Python function for calculating the area of a triangle:
And it should work like so:
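The exact code the model produces will vary from run to run, but a typical answer looks something along these lines (an illustrative sketch, not the model's literal output):

    def triangle_area(base: float, height: float) -> float:
        """Return the area of a triangle from its base and height."""
        return 0.5 * base * height

    print(triangle_area(10, 4))  # 20.0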