Simple, unified interface to multiple Generative AI providers.
aisuite is a tool designed for researchers who need to evaluate and compare the responses of multiple LLMs through a standardized interface. Based on the OpenAI interface standard, aisuite makes it easy to interact with the most popular LLMs and compare the results of their chat-based functionality, with support for more interfaces coming in the near future.
Currently supported providers are OpenAI, Anthropic, Azure, Google, AWS, Groq, Mistral, HuggingFace, and Ollama. Internally, aisuite uses either the HTTP endpoint or the SDK for making calls to the provider.
This installs just the base package without installing any provider's SDK.
pip install aisuite
This installs aisuite along with anthropic's library.
pip install aisuite[anthropic]
This installs all the provider-specific libraries.
pip install aisuite[all]
This library provides a thin wrapper around Python client libraries to interact with various Generative AI providers, allowing creators to seamlessly swap out or test responses from a number of LLMs without changing their code.
To get started, you will need API keys for the providers you intend to use. You will also need to install the provider-specific library, either separately or when installing aisuite. The API keys are expected to be in the host environment and can be set manually or by using a tool such as python-dotenv or direnv.
For example, if you wanted to use Anthropic's Claude 3.5 Sonnet in addition to OpenAI's GPT-4o, you would first need to set the API keys:
export ANTHROPIC_API_KEY="your-anthropic-api-key"
export OPENAI_API_KEY="your-openai-api-key"
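Alternatively, a minimal sketch using python-dotenv (one of the tools mentioned above) might look like the following; the .env file name and its contents are assumptions for illustration, not part of aisuite itself:

# Hypothetical sketch: load API keys from a local .env file with python-dotenv.
# Assumes `pip install python-dotenv` and a .env file containing lines such as
# OPENAI_API_KEY=your-openai-api-key and ANTHROPIC_API_KEY=your-anthropic-api-key.
from dotenv import load_dotenv

load_dotenv()  # reads .env and populates os.environ so aisuite can pick up the keys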
In your Python code:
import aisuite as ai

client = ai.Client()

models = ["openai:gpt-4o", "anthropic:claude-3-5-sonnet-20240620"]

messages = [
    {"role": "system", "content": "Respond in Pirate English."},
    {"role": "user", "content": "Tell me a joke."},
]

for model in models:
    response = client.chat.completions.create(
        model=model,
        messages=messages,
        temperature=0.75
    )
    print(response.choices[0].message.content)
Note that the model name in the create() call uses the format <provider>:<model-name>.
aisuite will call the appropriate provider with the right parameters based on the provider value.
For more examples, check out the examples directory, where you will find several notebooks that you can run to experiment with the interface.
aisuite is released under the MIT License. You are free to use, modify, and distribute the code for both commercial and non-commercial purposes.
If you would like to contribute, please read our Contributing Guide and join our Discord server!
We have made it easy for a provider or volunteer to add support for a new platform.
A convention-based approach is followed for loading providers, which relies on strict naming conventions for both the module name and the class name. The format to follow is based on the model identifier in the form provider:model.
- The provider's module file must be named in the format <provider>_provider.py.
- The class inside this module must follow the format: the provider name with the first letter capitalized, followed by the suffix Provider.
For example:
- AWS: The provider class should be defined as class AwsProvider(BaseProvider) in providers/aws_provider.py.
- OpenAI: The provider class should be defined as class OpenaiProvider(BaseProvider) in providers/openai_provider.py.
This convention simplifies the addition of new providers and ensures consistency across provider implementations.
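As an illustration only, a hypothetical provider module following this convention might look like the sketch below. The platform name "example", the configuration keys, and the method body are assumptions; the actual base-class interface and method signatures are defined in aisuite's source.

# providers/example_provider.py -- hypothetical sketch of the naming convention.
# "example" is an invented platform name; a real provider would subclass aisuite's
# base provider class and implement the actual call to the platform's SDK or HTTP API.

class ExampleProvider:  # in a real implementation: class ExampleProvider(BaseProvider)
    """Handles model identifiers of the form example:<model-name>."""

    def __init__(self, **config):
        # Credentials would typically come from config or an environment variable.
        self.api_key = config.get("api_key")

    def chat_completions_create(self, model, messages, **kwargs):
        # Translate the OpenAI-style messages into the platform's request format,
        # make the call, and map the reply back into an OpenAI-style response so
        # client.chat.completions.create() behaves uniformly across providers.
        raise NotImplementedError("Illustrative sketch only")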