
Stevor Extension for VS Code

This extension allows you to interact with language models via the Ollama API, providing a rich chat interface within Visual Studio Code.

Features

  • Chat with LLMs: Engage in real-time conversations with various language models supported by Ollama.

  • Model Selection: Choose from a list of installed models to tailor your conversation experience.

  • Syntax Highlighting: Uses highlight.js to highlight code blocks in chat responses (see the sketch after this list).

  • User-Friendly Interface: A simple and intuitive interface for entering prompts and viewing responses.
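
As a rough illustration of the syntax-highlighting feature, the snippet below shows how a code block from a model response could be run through highlight.js before being rendered in the chat panel. The helper name and HTML wrapping are illustrative, not the extension's actual implementation.

```typescript
import hljs from "highlight.js";

// Illustrative helper: wrap a fenced code block from a model response
// in highlighted <pre><code> HTML for display in the chat webview.
function renderCodeBlock(code: string, language?: string): string {
  const highlighted =
    language && hljs.getLanguage(language)
      ? hljs.highlight(code, { language }).value
      : hljs.highlightAuto(code).value;
  return `<pre><code class="hljs">${highlighted}</code></pre>`;
}
```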

Installation

Open Visual Studio Code.

Go to the Extensions view by clicking the Extensions icon in the Activity Bar on the side of the window, or by pressing Ctrl+Shift+X.

Search for "Stevor".

Click Install and then Enable.

Usage

Once installed, you can activate the extension by opening the Command Palette (Ctrl+Shift+P).

Type "LLM Chat" and select it from the list.

A new panel will open where you can select a model and enter your prompts.

Click the "Send" button to receive responses from the selected language model.

Configuration

Models: The extension automatically lists all installed models provided by Ollama. You can select any of these models for your conversations. Make sure Ollama is running on your system.
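
A minimal sketch of how the list of installed models can be discovered, using Ollama's documented GET /api/tags endpoint; the helper name is illustrative, not the extension's actual code.

```typescript
// Illustrative only: Ollama reports installed models at GET /api/tags.
async function listInstalledModels(): Promise<string[]> {
  const res = await fetch("http://localhost:11434/api/tags");
  if (!res.ok) {
    throw new Error(`Could not reach Ollama: ${res.status}`);
  }
  const data = (await res.json()) as { models: { name: string }[] };
  return data.models.map((m) => m.name); // e.g. ["llama3:latest", "mistral:latest"]
}
```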

Prompts: Enter your prompt in the text area and click "Send" to get a response.

Known Issues

  • Switching to a different editor tab clears the current chat history.

If you encounter any other issues or have feedback, please report them on the GitHub repository.

Release Notes

1.0.0

Initial release of the Stevor extension for VS Code, supporting basic chat functionality with Ollama language models.

Thank you for using Stevor! We hope this extension enhances your coding experience by providing a seamless interface to interact with AI language models.

Enjoy!
