
Local llama support #55

Closed
wants to merge 5 commits

Conversation

bytedisciple

Added support for using a locally running instance of a LLaMA model instead of the OpenAI APIs.

Added two new params to aider to enable local LLaMA support.

  1. AIDER_MODEL_TOKENS - used to specify the context length the model will use.
  2. AIDER_TOKENIZER - used to specify which tokenizer should be used. Currently only 'openai' and 'llama' are supported. Defaults to openai.

Tested with TheBloke_wizard-vicuna-13B-SuperHOT-8K-GGML running locally and the following ENV values set:

AIDER_OPENAI_API_BASE=http://127.0.0.1:5001/v1
AIDER_MODEL=TheBloke_wizard-vicuna-13B-SuperHOT-8K-GGML
AIDER_MODEL_TOKENS=2
AIDER_TOKENIZER=llama
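
For context, here is a minimal sketch of how the tokenizer switch could be wired up. The function name, the Hugging Face model id, and the fallback encoding below are illustrative assumptions, not the exact code in this PR:

```python
import os


def get_token_counter(tokenizer_name, model_name):
    """Return a callable that counts tokens for the configured tokenizer."""
    if tokenizer_name == "llama":
        # Illustrative only: any LLaMA-family tokenizer works; this model id is just an example.
        from transformers import AutoTokenizer
        tok = AutoTokenizer.from_pretrained("huggyllama/llama-7b")
        return lambda text: len(tok.encode(text))
    # Default: OpenAI-style tokenization via tiktoken.
    import tiktoken
    try:
        enc = tiktoken.encoding_for_model(model_name)
    except KeyError:
        enc = tiktoken.get_encoding("cl100k_base")  # fallback for unrecognized model names
    return lambda text: len(enc.encode(text))


count_tokens = get_token_counter(
    os.environ.get("AIDER_TOKENIZER", "openai"),
    os.environ.get("AIDER_MODEL", "gpt-3.5-turbo"),
)
```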

@bytedisciple bytedisciple marked this pull request as ready for review July 5, 2023 18:47
@bytedisciple
Author

Currently working through a bug where the files created/edited are not named sanely. Given the following prompt:

Write test.py, a command line utility in python which, when run with python test.py, returns the current unix time in seconds to STDOUT. Follow PEP 8 style guidelines.

When using a local LLM, the file created is named something like "Sure, here's an example of what you described:", which is the first line of the LLM's response.

It should create a file named test.py. However, the contents of the file are correct and work as expected.
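
For reference, the prompt above fully specifies the expected file; something along these lines would satisfy it:

```python
#!/usr/bin/env python3
"""Print the current Unix time in seconds to STDOUT."""

import time


def main():
    print(int(time.time()))


if __name__ == "__main__":
    main()
```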

@paul-gauthier
Collaborator

Sounds like the model isn't obeying the edit format from the system prompt? Which edit format are you asking it to use? The "whole" format is easier for models to understand, which is why aider uses it for gpt-3.5-turbo.
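
Roughly speaking (this is an approximation of the "whole" format, not quoted from aider's prompts), the model is expected to reply with the file path on its own line followed by a fenced block containing the complete file, e.g.:

test.py
```python
import time

print(int(time.time()))
```

A chatty preamble like "Sure, here's an example of what you described:" would end up being parsed as the path, which would explain the bad filename described above.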
