This plugin is a hub for configuring AI providers (OpenAI-compatible, Ollama, and more) in one place.

Obsidian AI Providers

⚠️ Important Note: This plugin is a configuration tool - it helps you manage your AI settings in one place.

Think of it like a control panel where you can:

  • Store your API keys and settings for AI services
  • Share these settings with other Obsidian plugins
  • Avoid entering the same AI settings multiple times

The plugin itself doesn't do any AI processing - it just helps other plugins connect to AI services more easily.

Required by plugins

Supported providers

  • Ollama
  • OpenAI
  • OpenAI compatible API
  • OpenRouter
  • Google Gemini
  • LM Studio
  • Groq

Features

  • Fully encapsulated API for working with AI providers
  • Develop AI plugins faster without dealing directly with provider-specific APIs
  • Easily extend support for additional AI providers in your plugin
  • Available in 4 languages: English, Chinese, German, and Russian (more languages coming soon)

Installation

Obsidian plugin store (recommended)

This plugin is available in the Obsidian community plugin store: https://obsidian.md/plugins?id=ai-providers

BRAT

You can install this plugin via BRAT: pfrankov/obsidian-ai-providers

Create AI provider

Ollama

  1. Install Ollama.
  2. Install Gemma 2 with ollama pull gemma2, or any other preferred model from the library.
  3. Select Ollama in Provider type.
  4. Click the refresh button and select the model that suits your needs (e.g. gemma2).

Note: if you have issues with streaming completions from Ollama, try setting the environment variable OLLAMA_ORIGINS to *:

  • For macOS, run launchctl setenv OLLAMA_ORIGINS "*".
  • For Linux and Windows, check the docs.
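If the model list stays empty after clicking refresh, you can first check that Ollama is reachable outside Obsidian. A minimal sketch (Node 18+) using Ollama's documented GET /api/tags endpoint; the response type below is trimmed to just the field this sketch uses:

```typescript
// Ollama's GET /api/tags lists locally installed models.
// Default port is 11434; adjust baseUrl if you changed it.
interface OllamaTagsResponse {
  models: { name: string }[];
}

// Pure helper: pull the model names out of a /api/tags response.
function modelNames(response: OllamaTagsResponse): string[] {
  return response.models.map((m) => m.name);
}

async function listOllamaModels(baseUrl = "http://localhost:11434"): Promise<string[]> {
  // Node 18+ ships a global fetch; cast via globalThis so this compiles
  // without DOM type definitions.
  const fetchFn = (globalThis as any).fetch as (url: string) => Promise<any>;
  const res = await fetchFn(`${baseUrl}/api/tags`);
  if (!res.ok) throw new Error(`Ollama responded with HTTP ${res.status}`);
  return modelNames((await res.json()) as OllamaTagsResponse);
}

// Example response shape after `ollama pull gemma2`:
const sample: OllamaTagsResponse = { models: [{ name: "gemma2:latest" }] };
console.log(modelNames(sample));
```

If this call fails from a script but Ollama works in the terminal, the OLLAMA_ORIGINS fix above is the usual culprit.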

OpenAI

  1. Select OpenAI in Provider type.
  2. Set Provider URL to https://api.openai.com/v1
  3. Retrieve your API key from the API keys page and paste it in.
  4. Click the refresh button and select the model that suits your needs (e.g. gpt-4o).
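You can also sanity-check a Provider URL and API key outside Obsidian with the standard OpenAI-style GET /models endpoint, which OpenAI, OpenRouter, Gemini's OpenAI-compatibility layer, LM Studio, and Groq all expose. A minimal sketch (Node 18+; the key is a placeholder you supply yourself):

```typescript
// An OpenAI-compatible GET {base}/models call returns { data: [{ id: ... }] }.
interface ModelList {
  data: { id: string }[];
}

// Pure helper: extract the model ids from a /models response.
function modelIds(list: ModelList): string[] {
  return list.data.map((m) => m.id);
}

async function listModels(baseUrl: string, apiKey: string): Promise<string[]> {
  // Cast global fetch via globalThis to compile without DOM type definitions.
  const fetchFn = (globalThis as any).fetch as (url: string, init?: object) => Promise<any>;
  const res = await fetchFn(`${baseUrl}/models`, {
    headers: { Authorization: `Bearer ${apiKey}` },
  });
  if (!res.ok) throw new Error(`HTTP ${res.status}`);
  return modelIds((await res.json()) as ModelList);
}

// Only baseUrl and key differ between providers, e.g.:
//   listModels("https://api.openai.com/v1", myKey)
//   listModels("https://api.groq.com/openai/v1", myKey)
const sample: ModelList = { data: [{ id: "gpt-4o" }, { id: "gpt-4o-mini" }] };
console.log(modelIds(sample));
```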

OpenAI compatible server

There are several options for running a local OpenAI-compatible server.

OpenRouter

  1. Select OpenRouter in Provider type.
  2. Set Provider URL to https://openrouter.ai/api/v1
  3. Retrieve your API key from the API keys page and paste it in.
  4. Click the refresh button and select the model that suits your needs (e.g. anthropic/claude-3.7-sonnet).

Google Gemini

  1. Select Google Gemini in Provider type.
  2. Set Provider URL to https://generativelanguage.googleapis.com/v1beta/openai
  3. Retrieve your API key from the API keys page and paste it in.
  4. Click the refresh button and select the model that suits your needs (e.g. gemini-1.5-flash).

LM Studio

  1. Select LM Studio in Provider type.
  2. Set Provider URL to http://localhost:1234/v1
  3. Click the refresh button and select the model that suits your needs (e.g. gemma2).

Groq

  1. Select Groq in Provider type.
  2. Set Provider URL to https://api.groq.com/openai/v1
  3. Retrieve your API key from the API keys page and paste it in.
  4. Click the refresh button and select the model that suits your needs (e.g. llama3-70b-8192).

For plugin developers

Docs: How to integrate AI Providers in your plugin.
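As a rough illustration of the idea (not the actual SDK; see the docs above for the real interface), a consuming plugin only selects a provider and sends a prompt, while keys and URLs stay in the hub. Every name in this sketch (AIProvider, AIProvidersHub, execute) is hypothetical:

```typescript
// Hypothetical shape of a provider hub, for illustration only.
interface AIProvider {
  id: string;
  name: string;
  url: string;
  model: string;
}

interface AIProvidersHub {
  providers: AIProvider[];
  execute(provider: AIProvider, prompt: string): Promise<string>;
}

// A consuming plugin never touches API keys or provider URLs directly:
// it picks a provider from the hub and sends a prompt.
async function summarize(hub: AIProvidersHub, note: string): Promise<string> {
  const provider = hub.providers[0]; // or let the user choose in settings
  return hub.execute(provider, `Summarize:\n${note}`);
}

// Mock hub demonstrating the flow without any network access.
const mockHub: AIProvidersHub = {
  providers: [{ id: "p1", name: "Ollama", url: "http://localhost:11434", model: "gemma2" }],
  execute: async (_provider, prompt) => `echo: ${prompt.slice(0, 10)}`,
};
summarize(mockHub, "Hello world").then(console.log);
```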

Roadmap

  • Docs for devs
  • Ollama context optimizations
  • Image processing support
  • OpenRouter Provider support
  • Gemini Provider support
  • LM Studio Provider support
  • Groq Provider support
  • Anthropic Provider support
  • Shared embeddings to avoid re-embedding the same documents multiple times
  • Spanish, Italian, French, Dutch, Portuguese, Japanese, Korean translations
  • Encapsulated basic RAG search with optional BM25 search

My other Obsidian plugins

  • Local GPT, which assists with local AI for maximum privacy and offline access.
  • Colored Tags, which colorizes tags in distinguishable colors.
