This extension allows you to interact with language models via the Ollama API, providing a rich chat interface within Visual Studio Code.
- Chat with LLMs: Engage in real-time conversations with the language models installed through Ollama.
- Model Selection: Choose from a list of installed models to tailor your conversation experience.
- Syntax Highlighting: Uses highlight.js to highlight code blocks within chat responses.
- User-Friendly Interface: A simple, intuitive interface for entering prompts and viewing responses.
1. Open Visual Studio Code.
2. Go to the Extensions view by clicking the Extensions icon in the Activity Bar or pressing Ctrl+Shift+X.
3. Search for "Stevor".
4. Click Install, then Enable.
1. Once installed, activate the extension by opening the Command Palette (Ctrl+Shift+P).
2. Type "LLM Chat" and select it from the list.
3. A new panel opens where you can select a model and enter your prompts.
4. Click "Send" to receive a response from the selected language model.
Models: The extension automatically lists all models installed in Ollama, and you can select any of them for your conversations. Ollama must be running on your system for the model list to appear.
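If the model list stays empty, you can check what Ollama sees by querying its local API yourself. A minimal sketch, assuming Ollama is running at its default endpoint (`http://localhost:11434`) and using its `GET /api/tags` endpoint, which returns installed models:

```python
import json
from urllib import request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local endpoint


def parse_model_names(tags_json: str) -> list[str]:
    """Extract model names from the JSON body returned by GET /api/tags."""
    data = json.loads(tags_json)
    return [model["name"] for model in data.get("models", [])]


def list_installed_models() -> list[str]:
    """Fetch the list of installed models from a running Ollama instance."""
    with request.urlopen(f"{OLLAMA_URL}/api/tags") as resp:
        return parse_model_names(resp.read().decode("utf-8"))
```

If `list_installed_models()` raises a connection error, Ollama is not running; if it returns an empty list, pull a model first (e.g. `ollama pull llama3`).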
Prompts: Enter your prompt in the text area and click "Send" to get a response.
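Under the hood, a prompt like this is sent to Ollama as a chat request. The sketch below builds the JSON body for Ollama's documented `POST /api/chat` endpoint; the helper name `build_chat_request` is illustrative, not part of the extension:

```python
import json


def build_chat_request(model: str, prompt: str, stream: bool = False) -> str:
    """Build the JSON body for Ollama's POST /api/chat endpoint.

    The endpoint expects a model name and a list of chat messages;
    setting "stream" to False asks for a single complete response.
    """
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": stream,
    }
    return json.dumps(body)
```

POSTing this body to `http://localhost:11434/api/chat` returns the model's reply; with `stream=True`, Ollama instead streams the response chunk by chunk, which is how a chat UI can render text as it arrives.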
If you encounter any issues or have feedback, please report them on the GitHub repository.
- Switching to a different tab erases the current chat history.
1.0.0
Initial release of the Stevor extension for VS Code, supporting basic chat functionality with Ollama language models.
Thank you for using Stevor! We hope this extension enhances your coding experience by providing a seamless interface to interact with AI language models.
Enjoy!