miniflux-ai

Miniflux with AI

This project integrates with Miniflux to fetch RSS feed content via API or webhook. It then utilizes large language models (e.g., Ollama, ChatGPT, LLaMA, Gemini) to generate summaries, translations, and AI-driven news insights.

Features

  • Miniflux Integration: Seamlessly fetch unread entries from Miniflux or trigger via webhook.
  • LLM Processing: Generate summaries, translations, etc. based on your chosen LLM agent.
  • AI News: Use the LLM agent to generate AI morning and evening news from feed content.
  • Flexible Configuration: Easily modify or add new agents via the config.yml file.
  • Markdown and HTML Support: Outputs in Markdown or styled HTML blocks, depending on configuration.
[Screenshots: AI summaries and translations; AI News]

Requirements

  • Python 3.11+
  • Dependencies: Install via pip install -r requirements.txt
  • Miniflux API Key
  • An API key for an OpenAI-compatible LLM endpoint (e.g., Ollama serving LLaMA 3.1)

Configuration

The repository includes a template configuration file: config.sample.yml. Copy it to config.yml and edit it to set up:

  • Miniflux: Base URL and API key.
  • LLM: Model settings, API key, and endpoint. The timeout and max_workers parameters control the per-request timeout and the number of worker threads used for concurrent processing.
  • AI News: Schedule and prompts for daily news generation.
  • Agents: Each agent's prompt, allow_list/deny_list filters, and output style (the style_block parameter controls whether the output is formatted as a code block in Markdown).

If using a webhook, enter the URL in Miniflux under Settings > Integrations > Webhook > Webhook URL. If miniflux-ai is deployed in a container alongside Miniflux, use the following URL: http://miniflux_ai/api/miniflux-ai.
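For orientation, a minimal config.yml might look like the sketch below. The parameters named in the list above (timeout, max_workers, style_block, allow_list, deny_list) come from this README, but the section and key layout is an assumption; config.sample.yml in the repository is the authoritative reference, so treat every name and value here as illustrative only.

```
# Illustrative sketch only -- section and key names are assumptions;
# copy config.sample.yml from the repository for the authoritative layout.
miniflux:
  base_url: https://miniflux.example.org   # your Miniflux instance
  api_key: YOUR_MINIFLUX_API_KEY

llm:
  base_url: http://ollama:11434/v1         # any OpenAI-compatible endpoint
  api_key: YOUR_LLM_API_KEY
  model: llama3.1
  timeout: 60                              # seconds to wait per LLM request
  max_workers: 4                           # worker threads for concurrent entries

ai_news:
  schedule: "07:00"                        # when to generate the daily digest
  prompt: "Write a short morning news briefing from these articles."

agents:
  summary:
    prompt: "Summarize the following article in three sentences."
    style_block: true                      # wrap the output in a Markdown code block
    allow_list: []                         # only process feeds matching these rules
    deny_list: []                          # skip feeds matching these rules
```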

Docker Setup

The project includes a docker-compose.yml file for easy deployment:

If using the webhook or AI news features, it is recommended to run miniflux-ai in the same docker-compose.yml as Miniflux and reach it via its container name (a combined sketch follows the example below).

```
version: '3.3'
services:
    miniflux_ai:
        container_name: miniflux_ai
        image: ghcr.io/qetesh/miniflux-ai:latest
        restart: always
        environment:
            TZ: Asia/Shanghai
        volumes:
            - ./config.yml:/app/config.yml
```

Refer to config.sample.*.yml and create config.yml. Then start the service:

docker-compose up -d
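When Miniflux runs in the same compose project, the webhook URL http://miniflux_ai/api/miniflux-ai resolves via the container name. The combined stack below is only a sketch and is not part of this repository: the Miniflux and PostgreSQL services, their images, and their environment values are assumptions to be adapted to your own setup.

```
version: '3.3'
services:
    miniflux:
        container_name: miniflux
        image: miniflux/miniflux:latest
        restart: always
        depends_on:
            - db
        ports:
            - "8080:8080"
        environment:
            # Minimal example values -- adjust for your deployment.
            DATABASE_URL: postgres://miniflux:secret@db/miniflux?sslmode=disable
            RUN_MIGRATIONS: 1

    db:
        container_name: miniflux_db
        image: postgres:16
        restart: always
        environment:
            POSTGRES_USER: miniflux
            POSTGRES_PASSWORD: secret
            POSTGRES_DB: miniflux

    miniflux_ai:
        container_name: miniflux_ai
        image: ghcr.io/qetesh/miniflux-ai:latest
        restart: always
        environment:
            TZ: Asia/Shanghai
        volumes:
            - ./config.yml:/app/config.yml
```

The same docker-compose up -d command brings up the whole stack, and the webhook URL can then be entered in Miniflux under Settings > Integrations > Webhook.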

Usage

  1. Ensure config.yml is properly configured.
  2. Run the script: python main.py
  3. The script will fetch unread RSS entries, process them with the LLM, and update the content in Miniflux.

Roadmap

  • Add daily summary (based on entry titles and existing AI summaries)
    • Add morning and evening news (e.g. 9/24: AI Morning News, 9/24: AI Evening News)
    • Add timed summaries

FAQ

If the formatting of summary content is incorrect, add the following CSS under Settings > Custom CSS:

```
pre code {
  white-space: pre-wrap;
  word-wrap: break-word;
}
```

Contributing

Feel free to fork this repository and submit pull requests. Contributions and issues are welcome!

Star History

Star History Chart

License

This project is licensed under the MIT License.