From c5e127d6facf0b5070117cb5d8a75b17e1dbe061 Mon Sep 17 00:00:00 2001 From: Paul Gauthier Date: Wed, 5 Jun 2024 20:50:55 -0700 Subject: [PATCH] Broke apart llms --- README.md | 5 +- website/docs/llms.md | 388 +--------------------------- website/docs/llms/anthropic.md | 34 +++ website/docs/llms/azure.md | 27 ++ website/docs/llms/cohere.md | 25 ++ website/docs/llms/deepseek.md | 24 ++ website/docs/llms/editing-format.md | 22 ++ website/docs/llms/gemini.md | 25 ++ website/docs/llms/groq.md | 27 ++ website/docs/llms/ollama.md | 43 +++ website/docs/llms/openai-compat.md | 27 ++ website/docs/llms/openai.md | 37 +++ website/docs/llms/openrouter.md | 35 +++ website/docs/llms/other.md | 39 +++ website/docs/llms/warnings.md | 70 +++++ 15 files changed, 444 insertions(+), 384 deletions(-) create mode 100644 website/docs/llms/anthropic.md create mode 100644 website/docs/llms/azure.md create mode 100644 website/docs/llms/cohere.md create mode 100644 website/docs/llms/deepseek.md create mode 100644 website/docs/llms/editing-format.md create mode 100644 website/docs/llms/gemini.md create mode 100644 website/docs/llms/groq.md create mode 100644 website/docs/llms/ollama.md create mode 100644 website/docs/llms/openai-compat.md create mode 100644 website/docs/llms/openai.md create mode 100644 website/docs/llms/openrouter.md create mode 100644 website/docs/llms/other.md create mode 100644 website/docs/llms/warnings.md diff --git a/README.md b/README.md index 97f445d6f3b..395799505d8 100644 --- a/README.md +++ b/README.md @@ -44,8 +44,8 @@ $ aider --opus - New features, changes, improvements, or bug fixes to your code. - New test cases, updated documentation or code refactors. - Paste in a GitHub issue url that needs to be solved. -- Aider will edit your files. -- Aider [automatically git commits changes](https://aider.chat/docs/faq.html#how-does-aider-use-git) with a sensible commit message. +- Aider will edit your files to complete your request. +- Aider [automatically git commits](https://aider.chat/docs/faq.html#how-does-aider-use-git) changes with a sensible commit message. - Aider works with [most popular languages](https://aider.chat/docs/languages.html): python, javascript, typescript, php, html, css, and more... - Aider works well with GPT-4o, Claude 3 Opus, GPT-3.5 and supports [connecting to many LLMs](https://aider.chat/docs/llms.html). - Aider can make coordinated changes across multiple files at once. @@ -55,6 +55,7 @@ Aider will notice and always use the latest version. So you can bounce back and forth between aider and your editor, to collaboratively code with AI. - Images can be added to the chat (GPT-4o, GPT-4 Turbo, etc). - URLs can be added to the chat and aider will read their content. +- [Code with your voice](https://aider.chat/docs/voice.html) using speech recognition. ## Documentation diff --git a/website/docs/llms.md b/website/docs/llms.md index 79da9a92c48..8849e54a80a 100644 --- a/website/docs/llms.md +++ b/website/docs/llms.md @@ -1,6 +1,7 @@ --- title: Connecting to LLMs nav_order: 70 +has_children: true --- # Aider can connect to most LLMs @@ -47,388 +48,6 @@ this is usually because the model isn't capable of properly returning "code edits". Models weaker than GPT 3.5 may have problems working well with aider. -## Configuring models -{: .no_toc } - -- TOC -{:toc} - -Aider uses the LiteLLM package to connect to LLM providers. 
-The [LiteLLM provider docs](https://docs.litellm.ai/docs/providers) -contain more detail on all the supported providers, -their models and any required environment variables. - -## OpenAI - -To work with OpenAI's models, you need to provide your -[OpenAI API key](https://help.openai.com/en/articles/4936850-where-do-i-find-my-secret-api-key) -either in the `OPENAI_API_KEY` environment variable or -via the `--openai-api-key` command line switch. - -Aider has some built in shortcuts for the most popular OpenAI models and -has been tested and benchmarked to work well with them: - -``` -pip install aider-chat - -export OPENAI_API_KEY= # Mac/Linux -setx OPENAI_API_KEY # Windows - -# GPT-4o is the best model, used by default -aider - -# GPT-4 Turbo (1106) -aider --4-turbo - -# GPT-3.5 Turbo -aider --35-turbo - -# List models available from OpenAI -aider --models openai/ -``` - -You can use `aider --model ` to use any other OpenAI model. -For example, if you want to use a specific version of GPT-4 Turbo -you could do `aider --model gpt-4-0125-preview`. - -## Anthropic - -To work with Anthropic's models, you need to provide your -[Anthropic API key](https://docs.anthropic.com/claude/reference/getting-started-with-the-api) -either in the `ANTHROPIC_API_KEY` environment variable or -via the `--anthropic-api-key` command line switch. - -Aider has some built in shortcuts for the most popular Anthropic models and -has been tested and benchmarked to work well with them: - -``` -pip install aider-chat - -export ANTHROPIC_API_KEY= # Mac/Linux -setx ANTHROPIC_API_KEY # Windows - -# Claude 3 Opus -aider --opus - -# Claude 3 Sonnet -aider --sonnet - -# List models available from Anthropic -aider --models anthropic/ -``` - -You can use `aider --model ` to use any other Anthropic model. -For example, if you want to use a specific version of Opus -you could do `aider --model claude-3-opus-20240229`. - -## Gemini - -Google currently offers -[*free* API access to the Gemini 1.5 Pro model](https://ai.google.dev/pricing). -This is the most capable free model to use with aider, -with code editing capability that's comparable to GPT-3.5. -You'll need a [Gemini API key](https://aistudio.google.com/app/u/2/apikey). - -``` -pip install aider-chat - -export GEMINI_API_KEY= # Mac/Linux -setx GEMINI_API_KEY # Windows - -aider --model gemini/gemini-1.5-pro-latest - -# List models available from Gemini -aider --models gemini/ -``` - -## GROQ - -Groq currently offers *free* API access to the models they host. -The Llama 3 70B model works -well with aider and is comparable to GPT-3.5 in code editing performance. -You'll need a [Groq API key](https://console.groq.com/keys). - -To use **Llama3 70B**: - -``` -pip install aider-chat - -export GROQ_API_KEY= # Mac/Linux -setx GROQ_API_KEY # Windows - -aider --model groq/llama3-70b-8192 - -# List models available from Groq -aider --models groq/ -``` - - -## Cohere - -Cohere offers *free* API access to their models. -Their Command-R+ model works well with aider -as a *very basic* coding assistant. -You'll need a [Cohere API key](https://dashboard.cohere.com/welcome/login). - -To use **Command-R+**: - -``` -pip install aider-chat - -export COHERE_API_KEY= # Mac/Linux -setx COHERE_API_KEY # Windows - -aider --model command-r-plus - -# List models available from Cohere -aider --models cohere_chat/ -``` - -## Azure - -Aider can connect to the OpenAI models on Azure. 
- -``` -pip install aider-chat - -# Mac/Linux: -export AZURE_API_KEY= -export AZURE_API_VERSION=2023-05-15 -export AZURE_API_BASE=https://myendpt.openai.azure.com - -# Windows: -setx AZURE_API_KEY -setx AZURE_API_VERSION 2023-05-15 -setx AZURE_API_BASE https://myendpt.openai.azure.com - -aider --model azure/ - -# List models available from Azure -aider --models azure/ -``` - -## OpenRouter - -Aider can connect to [models provided by OpenRouter](https://openrouter.ai/models?o=top-weekly): -You'll need an [OpenRouter API key](https://openrouter.ai/keys). - -``` -pip install aider-chat - -export OPENROUTER_API_KEY= # Mac/Linux -setx OPENROUTER_API_KEY # Windows - -# Or any other open router model -aider --model openrouter// - -# List models available from OpenRouter -aider --models openrouter/ -``` - -In particular, Llama3 70B works well with aider, at low cost: - -``` -pip install aider-chat - -export OPENROUTER_API_KEY= # Mac/Linux -setx OPENROUTER_API_KEY # Windows - -aider --model openrouter/meta-llama/llama-3-70b-instruct -``` - - -## Ollama - -Aider can connect to local Ollama models. - -``` -# Pull the model -ollama pull - -# Start your ollama server -ollama serve - -# In another terminal window... -pip install aider-chat - -export OLLAMA_API_BASE=http://127.0.0.1:11434 # Mac/Linux -setx OLLAMA_API_BASE http://127.0.0.1:11434 # Windows - -aider --model ollama/ -``` - -In particular, `llama3:70b` works very well with aider: - - -``` -ollama pull llama3:70b -ollama serve - -# In another terminal window... -export OLLAMA_API_BASE=http://127.0.0.1:11434 # Mac/Linux -setx OLLAMA_API_BASE http://127.0.0.1:11434 # Windows - -aider --model ollama/llama3:70b -``` - -Also see the [model warnings](#model-warnings) -section for information on warnings which will occur -when working with models that aider is not familiar with. - - -## Deepseek - -Aider can connect to the Deepseek.com API. -Deepseek appears to grant 5M tokens of free API usage to new accounts. - -``` -pip install aider-chat - -export DEEPSEEK_API_KEY= # Mac/Linux -setx DEEPSEEK_API_KEY # Windows - -# Use Deepseek Chat v2 -aider --model deepseek/deepseek-chat -``` - -See the [model warnings](#model-warnings) -section for information on warnings which will occur -when working with models that aider is not familiar with. - - -## OpenAI compatible APIs - -Aider can connect to any LLM which is accessible via an OpenAI compatible API endpoint. - -``` -pip install aider-chat - -# Mac/Linux: -export OPENAI_API_BASE= -export OPENAI_API_KEY= - -# Windows: -setx OPENAI_API_BASE -setx OPENAI_API_KEY - -# Prefix the model name with openai/ -aider --model openai/ -``` - -See the [model warnings](#model-warnings) -section for information on warnings which will occur -when working with models that aider is not familiar with. - -## Other LLMs - -Aider uses the [litellm](https://docs.litellm.ai/docs/providers) package -to connect to hundreds of other models. -You can use `aider --model ` to use any supported model. - -To explore the list of supported models you can run `aider --models ` -with a partial model name. -If the supplied name is not an exact match for a known model, aider will -return a list of possible matching models. -For example: - -``` -$ aider --models turbo - -Aider v0.29.3-dev -Models which match "turbo": -- gpt-4-turbo-preview (openai/gpt-4-turbo-preview) -- gpt-4-turbo (openai/gpt-4-turbo) -- gpt-4-turbo-2024-04-09 (openai/gpt-4-turbo-2024-04-09) -- gpt-3.5-turbo (openai/gpt-3.5-turbo) -- ... 
-``` - -See the [list of providers supported by litellm](https://docs.litellm.ai/docs/providers) -for more details. - -## Model warnings - -Aider supports connecting to almost any LLM, -but it may not work well with less capable models. -If you see the model returning code, but aider isn't able to edit your files -and commit the changes... -this is usually because the model isn't capable of properly -returning "code edits". -Models weaker than GPT 3.5 may have problems working well with aider. - -Aider tries to sanity check that it is configured correctly -to work with the specified model: - -- It checks to see that all required environment variables are set for the model. These variables are required to configure things like API keys, API base URLs, etc. -- It checks a metadata database to look up the context window size and token costs for the model. - -Sometimes one or both of these checks will fail, so aider will issue -some of the following warnings. - -#### Missing environment variables - -``` -Model azure/gpt-4-turbo: Missing these environment variables: -- AZURE_API_BASE -- AZURE_API_VERSION -- AZURE_API_KEY -``` - -You need to set the listed environment variables. -Otherwise you will get error messages when you start chatting with the model. - - -#### Unknown which environment variables are required - -``` -Model gpt-5: Unknown which environment variables are required. -``` - -Aider is unable verify the environment because it doesn't know -which variables are required for the model. -If required variables are missing, -you may get errors when you attempt to chat with the model. -You can look in the -[litellm provider documentation](https://docs.litellm.ai/docs/providers) -to see if the required variables are listed there. - -#### Unknown model, did you mean? - -``` -Model gpt-5: Unknown model, context window size and token costs unavailable. -Did you mean one of these? -- gpt-4 -``` - -If you specify a model that aider has never heard of, you will get an -"unknown model" warning. -This means aider doesn't know the context window size and token costs -for that model. -Some minor functionality will be limited when using such models, but -it's not really a significant problem. - -Aider will also try to suggest similarly named models, -in case you made a typo or mistake when specifying the model name. - - -## Editing format - -Aider uses different "edit formats" to collect code edits from different LLMs. -The "whole" format is the easiest for an LLM to use, but it uses a lot of tokens -and may limit how large a file can be edited. -Models which can use one of the diff formats are much more efficient, -using far fewer tokens. -Models that use a diff-like format are able to -edit larger files with less cost and without hitting token limits. - -Aider is configured to use the best edit format for the popular OpenAI and Anthropic models -and the [other models recommended on the LLM page](https://aider.chat/docs/llms.html). -For lesser known models aider will default to using the "whole" editing format -since it is the easiest format for an LLM to use. - -If you would like to experiment with the more advanced formats, you can -use these switches: `--edit-format diff` or `--edit-format udiff`. 
- # Using a .env file Aider will read environment variables from a `.env` file in @@ -452,3 +71,8 @@ AZURE_API_BASE=https://example-endpoint.openai.azure.com OLLAMA_API_BASE=http://127.0.0.1:11434 ``` + + + + + diff --git a/website/docs/llms/anthropic.md b/website/docs/llms/anthropic.md new file mode 100644 index 00000000000..c2d2ff78fe5 --- /dev/null +++ b/website/docs/llms/anthropic.md @@ -0,0 +1,34 @@ +--- +parent: Connecting to LLMs +nav_order: 200 +--- + +# Anthropic + +To work with Anthropic's models, you need to provide your +[Anthropic API key](https://docs.anthropic.com/claude/reference/getting-started-with-the-api) +either in the `ANTHROPIC_API_KEY` environment variable or +via the `--anthropic-api-key` command line switch. + +Aider has some built in shortcuts for the most popular Anthropic models and +has been tested and benchmarked to work well with them: + +``` +pip install aider-chat + +export ANTHROPIC_API_KEY= # Mac/Linux +setx ANTHROPIC_API_KEY # Windows + +# Claude 3 Opus +aider --opus + +# Claude 3 Sonnet +aider --sonnet + +# List models available from Anthropic +aider --models anthropic/ +``` + +You can use `aider --model ` to use any other Anthropic model. +For example, if you want to use a specific version of Opus +you could do `aider --model claude-3-opus-20240229`. diff --git a/website/docs/llms/azure.md b/website/docs/llms/azure.md new file mode 100644 index 00000000000..ada2beb8d8d --- /dev/null +++ b/website/docs/llms/azure.md @@ -0,0 +1,27 @@ +--- +parent: Connecting to LLMs +nav_order: 500 +--- + +# Azure + +Aider can connect to the OpenAI models on Azure. + +``` +pip install aider-chat + +# Mac/Linux: +export AZURE_API_KEY= +export AZURE_API_VERSION=2023-05-15 +export AZURE_API_BASE=https://myendpt.openai.azure.com + +# Windows: +setx AZURE_API_KEY +setx AZURE_API_VERSION 2023-05-15 +setx AZURE_API_BASE https://myendpt.openai.azure.com + +aider --model azure/ + +# List models available from Azure +aider --models azure/ +``` diff --git a/website/docs/llms/cohere.md b/website/docs/llms/cohere.md new file mode 100644 index 00000000000..948049cc1ae --- /dev/null +++ b/website/docs/llms/cohere.md @@ -0,0 +1,25 @@ +--- +parent: Connecting to LLMs +nav_order: 500 +--- + +# Cohere + +Cohere offers *free* API access to their models. +Their Command-R+ model works well with aider +as a *very basic* coding assistant. +You'll need a [Cohere API key](https://dashboard.cohere.com/welcome/login). + +To use **Command-R+**: + +``` +pip install aider-chat + +export COHERE_API_KEY= # Mac/Linux +setx COHERE_API_KEY # Windows + +aider --model command-r-plus + +# List models available from Cohere +aider --models cohere_chat/ +``` diff --git a/website/docs/llms/deepseek.md b/website/docs/llms/deepseek.md new file mode 100644 index 00000000000..af086a99d42 --- /dev/null +++ b/website/docs/llms/deepseek.md @@ -0,0 +1,24 @@ +--- +parent: Connecting to LLMs +nav_order: 500 +--- + +# Deepseek + +Aider can connect to the Deepseek.com API. +Deepseek appears to grant 5M tokens of free API usage to new accounts. + +``` +pip install aider-chat + +export DEEPSEEK_API_KEY= # Mac/Linux +setx DEEPSEEK_API_KEY # Windows + +# Use Deepseek Chat v2 +aider --model deepseek/deepseek-chat +``` + +See the [model warnings](warnings.html) +section for information on warnings which will occur +when working with models that aider is not familiar with. 
+ diff --git a/website/docs/llms/editing-format.md b/website/docs/llms/editing-format.md new file mode 100644 index 00000000000..018b82f791a --- /dev/null +++ b/website/docs/llms/editing-format.md @@ -0,0 +1,22 @@ +--- +parent: Connecting to LLMs +nav_order: 850 +--- + +# Editing format + +Aider uses different "edit formats" to collect code edits from different LLMs. +The "whole" format is the easiest for an LLM to use, but it uses a lot of tokens +and may limit how large a file can be edited. +Models which can use one of the diff formats are much more efficient, +using far fewer tokens. +Models that use a diff-like format are able to +edit larger files with less cost and without hitting token limits. + +Aider is configured to use the best edit format for the popular OpenAI and Anthropic models +and the [other models recommended on the LLM page](https://aider.chat/docs/llms.html). +For lesser known models aider will default to using the "whole" editing format +since it is the easiest format for an LLM to use. + +If you would like to experiment with the more advanced formats, you can +use these switches: `--edit-format diff` or `--edit-format udiff`. diff --git a/website/docs/llms/gemini.md b/website/docs/llms/gemini.md new file mode 100644 index 00000000000..1a069a627ed --- /dev/null +++ b/website/docs/llms/gemini.md @@ -0,0 +1,25 @@ +--- +parent: Connecting to LLMs +nav_order: 300 +--- + +# Gemini + +Google currently offers +[*free* API access to the Gemini 1.5 Pro model](https://ai.google.dev/pricing). +This is the most capable free model to use with aider, +with code editing capability that's comparable to GPT-3.5. +You'll need a [Gemini API key](https://aistudio.google.com/app/u/2/apikey). + +``` +pip install aider-chat + +export GEMINI_API_KEY= # Mac/Linux +setx GEMINI_API_KEY # Windows + +aider --model gemini/gemini-1.5-pro-latest + +# List models available from Gemini +aider --models gemini/ +``` + diff --git a/website/docs/llms/groq.md b/website/docs/llms/groq.md new file mode 100644 index 00000000000..96c901423ac --- /dev/null +++ b/website/docs/llms/groq.md @@ -0,0 +1,27 @@ +--- +parent: Connecting to LLMs +nav_order: 400 +--- + +# GROQ + +Groq currently offers *free* API access to the models they host. +The Llama 3 70B model works +well with aider and is comparable to GPT-3.5 in code editing performance. +You'll need a [Groq API key](https://console.groq.com/keys). + +To use **Llama3 70B**: + +``` +pip install aider-chat + +export GROQ_API_KEY= # Mac/Linux +setx GROQ_API_KEY # Windows + +aider --model groq/llama3-70b-8192 + +# List models available from Groq +aider --models groq/ +``` + + diff --git a/website/docs/llms/ollama.md b/website/docs/llms/ollama.md new file mode 100644 index 00000000000..39f8c5f3831 --- /dev/null +++ b/website/docs/llms/ollama.md @@ -0,0 +1,43 @@ +--- +parent: Connecting to LLMs +nav_order: 500 +--- + +# Ollama + +Aider can connect to local Ollama models. + +``` +# Pull the model +ollama pull + +# Start your ollama server +ollama serve + +# In another terminal window... +pip install aider-chat + +export OLLAMA_API_BASE=http://127.0.0.1:11434 # Mac/Linux +setx OLLAMA_API_BASE http://127.0.0.1:11434 # Windows + +aider --model ollama/ +``` + +In particular, `llama3:70b` works well with aider: + + +``` +ollama pull llama3:70b +ollama serve + +# In another terminal window... 
+export OLLAMA_API_BASE=http://127.0.0.1:11434 # Mac/Linux
+setx OLLAMA_API_BASE http://127.0.0.1:11434 # Windows
+
+aider --model ollama/llama3:70b
+```
+
+See the [model warnings](warnings.html)
+section for information on warnings which will occur
+when working with models that aider is not familiar with.
+
diff --git a/website/docs/llms/openai-compat.md b/website/docs/llms/openai-compat.md
new file mode 100644
index 00000000000..a464ab345dc
--- /dev/null
+++ b/website/docs/llms/openai-compat.md
@@ -0,0 +1,27 @@
+---
+parent: Connecting to LLMs
+nav_order: 500
+---
+
+# OpenAI compatible APIs
+
+Aider can connect to any LLM which is accessible via an OpenAI compatible API endpoint.
+
+```
+pip install aider-chat
+
+# Mac/Linux:
+export OPENAI_API_BASE=
+export OPENAI_API_KEY=
+
+# Windows:
+setx OPENAI_API_BASE
+setx OPENAI_API_KEY
+
+# Prefix the model name with openai/
+aider --model openai/
+```
+
+See the [model warnings](warnings.html)
+section for information on warnings which will occur
+when working with models that aider is not familiar with.
diff --git a/website/docs/llms/openai.md b/website/docs/llms/openai.md
new file mode 100644
index 00000000000..40fee02e4f7
--- /dev/null
+++ b/website/docs/llms/openai.md
@@ -0,0 +1,37 @@
+---
+parent: Connecting to LLMs
+nav_order: 100
+---
+
+# OpenAI
+
+To work with OpenAI's models, you need to provide your
+[OpenAI API key](https://help.openai.com/en/articles/4936850-where-do-i-find-my-secret-api-key)
+either in the `OPENAI_API_KEY` environment variable or
+via the `--openai-api-key` command line switch.
+
+Aider has some built in shortcuts for the most popular OpenAI models and
+has been tested and benchmarked to work well with them:
+
+```
+pip install aider-chat
+
+export OPENAI_API_KEY= # Mac/Linux
+setx OPENAI_API_KEY # Windows
+
+# GPT-4o is the best model, used by default
+aider
+
+# GPT-4 Turbo (1106)
+aider --4-turbo
+
+# GPT-3.5 Turbo
+aider --35-turbo
+
+# List models available from OpenAI
+aider --models openai/
+```
+
+You can use `aider --model ` to use any other OpenAI model.
+For example, if you want to use a specific version of GPT-4 Turbo
+you could do `aider --model gpt-4-0125-preview`.
diff --git a/website/docs/llms/openrouter.md b/website/docs/llms/openrouter.md
new file mode 100644
index 00000000000..8bf224d53b9
--- /dev/null
+++ b/website/docs/llms/openrouter.md
@@ -0,0 +1,35 @@
+---
+parent: Connecting to LLMs
+nav_order: 500
+---
+
+# OpenRouter
+
+Aider can connect to [models provided by OpenRouter](https://openrouter.ai/models?o=top-weekly).
+You'll need an [OpenRouter API key](https://openrouter.ai/keys).
+
+```
+pip install aider-chat
+
+export OPENROUTER_API_KEY= # Mac/Linux
+setx OPENROUTER_API_KEY # Windows
+
+# Or any other OpenRouter model
+aider --model openrouter//
+
+# List models available from OpenRouter
+aider --models openrouter/
+```
+
+In particular, Llama3 70B works well with aider, at low cost:
+
+```
+pip install aider-chat
+
+export OPENROUTER_API_KEY= # Mac/Linux
+setx OPENROUTER_API_KEY # Windows
+
+aider --model openrouter/meta-llama/llama-3-70b-instruct
+```
+
+
diff --git a/website/docs/llms/other.md b/website/docs/llms/other.md
new file mode 100644
index 00000000000..e0a13b4ad04
--- /dev/null
+++ b/website/docs/llms/other.md
@@ -0,0 +1,39 @@
+---
+parent: Connecting to LLMs
+nav_order: 800
+---
+
+# Other LLMs
+
+Aider uses the [litellm](https://docs.litellm.ai/docs/providers) package
+to connect to hundreds of other models.
+You can use `aider --model ` to use any supported model.
+
+To explore the list of supported models you can run `aider --models `
+with a partial model name.
+If the supplied name is not an exact match for a known model, aider will
+return a list of possible matching models.
+For example:
+
+```
+$ aider --models turbo
+
+Aider v0.29.3-dev
+Models which match "turbo":
+- gpt-4-turbo-preview (openai/gpt-4-turbo-preview)
+- gpt-4-turbo (openai/gpt-4-turbo)
+- gpt-4-turbo-2024-04-09 (openai/gpt-4-turbo-2024-04-09)
+- gpt-3.5-turbo (openai/gpt-3.5-turbo)
+- ...
+```
+
+See the [model warnings](warnings.html)
+section for information on warnings which will occur
+when working with models that aider is not familiar with.
+
+## LiteLLM
+
+Aider uses the LiteLLM package to connect to LLM providers.
+The [LiteLLM provider docs](https://docs.litellm.ai/docs/providers)
+contain more detail on all the supported providers,
+their models and any required environment variables.
diff --git a/website/docs/llms/warnings.md b/website/docs/llms/warnings.md
new file mode 100644
index 00000000000..d96bb6d44dd
--- /dev/null
+++ b/website/docs/llms/warnings.md
@@ -0,0 +1,70 @@
+---
+parent: Connecting to LLMs
+nav_order: 900
+---
+
+# Model warnings
+
+Aider supports connecting to almost any LLM,
+but it may not work well with less capable models.
+If you see the model returning code, but aider isn't able to edit your files
+and commit the changes...
+this is usually because the model isn't capable of properly
+returning "code edits".
+Models weaker than GPT 3.5 may have problems working well with aider.
+
+Aider tries to sanity check that it is configured correctly
+to work with the specified model:
+
+- It checks to see that all required environment variables are set for the model. These variables are required to configure things like API keys, API base URLs, etc.
+- It checks a metadata database to look up the context window size and token costs for the model.
+
+Sometimes one or both of these checks will fail, so aider will issue
+some of the following warnings.
+
+## Missing environment variables
+
+```
+Model azure/gpt-4-turbo: Missing these environment variables:
+- AZURE_API_BASE
+- AZURE_API_VERSION
+- AZURE_API_KEY
+```
+
+You need to set the listed environment variables.
+Otherwise you will get error messages when you start chatting with the model.
+
+
+## Unknown which environment variables are required
+
+```
+Model gpt-5: Unknown which environment variables are required.
+```
+
+Aider is unable to verify the environment because it doesn't know
+which variables are required for the model.
+If required variables are missing,
+you may get errors when you attempt to chat with the model.
+You can look in the
+[litellm provider documentation](https://docs.litellm.ai/docs/providers)
+to see if the required variables are listed there.
+
+## Unknown model, did you mean?
+
+```
+Model gpt-5: Unknown model, context window size and token costs unavailable.
+Did you mean one of these?
+- gpt-4
+```
+
+If you specify a model that aider has never heard of, you will get an
+"unknown model" warning.
+This means aider doesn't know the context window size and token costs
+for that model.
+Some minor functionality will be limited when using such models, but
+it's not really a significant problem.
+
+Aider will also try to suggest similarly named models,
+in case you made a typo or mistake when specifying the model name.
+
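+## Checking your environment
+
+If you want to double-check which provider variables are actually visible to
+aider, you can list them from the same shell before launching it.
+This is a minimal sketch assuming a POSIX shell with `env` and `grep`;
+the names in the pattern are only examples, so adjust them for your provider:
+
+```
+# Show any provider-related variables set in this shell (names are examples)
+env | grep -E 'OPENAI|ANTHROPIC|GEMINI|GROQ|COHERE|AZURE|OPENROUTER|DEEPSEEK|OLLAMA'
+```
+
+On Windows, `set` prints the current environment variables in `cmd`,
+and `Get-ChildItem Env:` does the same in PowerShell.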