Auto-generate detailed and structured README files, powered by AI.
Objective
Readme-ai is a developer tool that auto-generates README.md files using a combination of data extraction and generative AI. Simply provide a repository URL or local path to your codebase, and a well-structured and detailed README file will be generated for you.
Motivation
Readme-ai streamlines documentation creation and maintenance, enhancing developer productivity. The project aims to enable developers of all skill levels, across all domains, to better understand, use, and contribute to open-source software.
Important
This project is currently under development with an opinionated configuration and setup. It is vital to review all text generated by the LLM APIs to ensure it accurately represents your project.
Standard CLI usage with an OpenAI API key (recommended).
You can also generate README files without an API key by using the `--offline` CLI option.
Tip
Offline mode is useful for quickly generating a boilerplate README without incurring API costs. See an offline mode README file here.
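For instance, an offline run might look like the following (a sketch using the repository's own URL; the guard makes the snippet safe to copy even if readmeai is not yet installed):

```shell
# Run only if the readmeai binary is available on PATH
if command -v readmeai >/dev/null 2>&1; then
  # --offline skips all LLM API calls, so no OPENAI_API_KEY is needed
  readmeai --repository https://github.com/eli64s/readme-ai --offline
else
  echo "readmeai is not installed; see the installation section below"
fi
```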
Built with flexibility in mind, readme-ai allows users to customize various aspects of the README file using CLI options and configuration settings. Content is generated using a combination of data extraction and making a few calls to LLM APIs.
Currently, readme-ai uses generative AI to create four distinct sections of the README file.
i. Header: Project slogan that describes the repository in an engaging way.
ii. Overview: Provides an intro to the project's core use-case and value proposition.
iii. Features: Markdown table containing details about the project's technical components.
iv. Modules: Codebase file summaries are generated and formatted into markdown tables.
All other content is extracted from processing and analyzing repository metadata and files.
The header section is built using repository metadata and CLI options. Key features include:
- Badges: SVG icons that represent codebase metadata, provided by shields.io and skill-icons.
- Project Logo: Select a project logo image from the base set or provide your image.
- Project Slogan: Catchphrase that describes the project, generated by generative AI.
- Table of Contents/Quick Links: Links to the different sections of the README file.
Below are a few examples of README headers generated by the readme-ai tool.
See the Configuration section below for the complete list of CLI options and settings.
📑 Codebase Documentation
- Repository Structure: A directory tree structure is created and displayed in the README. Implemented in pure Python (tree.py).
- Codebase Summaries: File summaries are generated using LLM APIs, then formatted and grouped by directory in Markdown tables.
📍 Overview & Features Table
The overview and features sections are generated using OpenAI's API. Structured prompt templates are injected with repository metadata to help produce more accurate and relevant content.
🚀 Dynamic Quick Start Guides
- Getting Started or Quick Start: Generates structured guides for installing, running, and testing your project. These steps are created by identifying dependencies and languages used in the codebase, and mapping this data to configuration files such as the language_setup.toml file.
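As a sketch of how such a mapping might look (hypothetical keys and commands; see the actual language_setup.toml in the readme-ai repository for the real schema):

```shell
# Write an illustrative language-to-setup-commands mapping to a temp file
cat > /tmp/language_setup_example.toml <<'EOF'
# Hypothetical entries -- the real file ships with readme-ai
[python]
install = "pip install -r requirements.txt"
run = "python main.py"
test = "pytest"
EOF
grep "install" /tmp/language_setup_example.toml
```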
🤝 Contributing Guidelines, License, & More
- Additional Sections: The remaining README sections are built from a baseline template that includes common sections such as Contributing Guidelines and License.
🧩 Templates
This feature is currently under development. The template system will allow users to generate README files in different flavors, such as AI, data, and web development.
🎨 Examples
| | Output File | Repository | Languages |
|---|---|---|---|
| 1️⃣ | readme-python.md | readme-ai | Python |
| 2️⃣ | readme-typescript.md | chatgpt-app-react-typescript | TypeScript, React |
| 3️⃣ | readme-javascript.md | (repository deleted) | JavaScript, React |
| 4️⃣ | readme-kotlin.md | file.io-android-client | Kotlin, Java, Android |
| 5️⃣ | readme-rust-c.md | rust-c-app | C, Rust |
| 6️⃣ | readme-go.md | go-docker-app | Go |
| 7️⃣ | readme-java.md | java-minimal-todo | Java |
| 8️⃣ | readme-fastapi-redis.md | async-ml-inference | Python, FastAPI, Redis |
| 9️⃣ | readme-mlops.md | mlops-course | Python, Jupyter |
| 🔟 | readme-pyflink.md | flink-flow | PyFlink |
Requirements
- Python: 3.9+
- Package manager or container runtime: `pip` or `docker` recommended.
- OpenAI API account and API key (other providers coming soon).
Repository
A repository URL or local path to your codebase is required to run readme-ai. GitHub, GitLab, and Bitbucket URLs, as well as local file paths, are supported.
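To illustrate the distinction between the two accepted forms (a standalone sketch, not readme-ai's actual validation code):

```shell
# Classify an input string as a remote URL or an existing local path
classify_repo() {
  case "$1" in
    http://*|https://*) echo "url" ;;
    *) if [ -d "$1" ]; then echo "path"; else echo "invalid"; fi ;;
  esac
}

classify_repo "https://github.com/eli64s/readme-ai"  # prints "url"
classify_repo "."                                    # prints "path"
```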
OpenAI API Key
An OpenAI API account and API key are needed to use readme-ai. The following steps outline the process.
🔐 OpenAI API Account Setup
- Go to the OpenAI website.
- Click the "Sign up for free" button.
- Fill out the registration form with your information and agree to the terms of service.
- Once logged in, click on the "API" tab.
- Follow the instructions to create a new API key.
- Copy the API key and keep it in a secure place.
Warning
Before using readme-ai, it's essential to understand the potential risks and costs associated with using AI-powered tools.
- Review Sensitive Information: Ensure all content in your repository is free of sensitive information before running the tool. This project does not remove sensitive data from your codebase, nor from the output README file.
- API Usage Costs: The OpenAI API is not free, and costs can accumulate quickly! You will be charged for each request made by readme-ai. Be sure to monitor API usage costs using the OpenAI API Usage Dashboard.
Using pip
pip install readmeai
Using docker
docker pull zeroxeli/readme-ai:latest
Using conda
conda install -c conda-forge readmeai
Alternatively, clone the readme-ai repository and build from source.
git clone https://github.com/eli64s/readme-ai && \
cd readme-ai
Then use one of the methods below to install the project's dependencies (Bash, Conda, Pipenv, or Poetry).
Using bash
bash setup/setup.sh
Using pipenv
pipenv install && \
pipenv shell
Using poetry
poetry install && \
poetry shell
Before running the application, ensure you have an OpenAI API key and that it is set as an environment variable.
On Linux or macOS
$ export OPENAI_API_KEY=YOUR_API_KEY
On Windows
$ set OPENAI_API_KEY=YOUR_API_KEY
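A quick sanity check before launching (a POSIX shell sketch; adapt accordingly for Windows):

```shell
# Fail fast with a clear message if the key is missing from the environment
if [ -z "${OPENAI_API_KEY:-}" ]; then
  echo "OPENAI_API_KEY is not set; export it before running readmeai" >&2
else
  echo "OPENAI_API_KEY is set"
fi
```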
Use one of the methods below to run the application (Pip, Docker, Conda, Streamlit, etc).
Using pip
readmeai --repository https://github.com/eli64s/readme-ai
Using docker
docker run -it \
-e OPENAI_API_KEY=$OPENAI_API_KEY \
-v "$(pwd)":/app zeroxeli/readme-ai:latest \
-r https://github.com/eli64s/readme-ai
Using conda
readmeai -r https://github.com/eli64s/readme-ai
Using streamlit
Note
The web app is hosted on Streamlit Community Cloud, a free service for sharing Streamlit apps. Thus, the app may be unstable or unavailable at times. See the readme-ai-streamlit repository for more details.
Alternatively, run the application locally from the cloned repository.
Using pipenv
pipenv shell && \
python3 -m readmeai.cli.commands -o readme-ai.md -r https://github.com/eli64s/readme-ai
Using poetry
poetry shell && \
poetry run python3 -m readmeai.cli.commands -o readme-ai.md -r https://github.com/eli64s/readme-ai
Use `pytest` to run the default test suite.
make test
Use `nox` to run the test suite against multiple Python versions (3.9, 3.10, 3.11, and 3.12).
nox -f noxfile.py
Run the `readmeai` command in your terminal with the following options to tailor your README file.
| Flag (Long/Short) | Default | Description | Type | Status |
|---|---|---|---|---|
| `--align`/`-a` | `center` | Set header text alignment (`left`, `center`). | String | Optional |
| `--api-key`/`-k` | `OPENAI_API_KEY` env var | Your GPT model API key. | String | Optional |
| `--badges`/`-b` | `default` | Badge style options for your README file. | String | Optional |
| `--emojis`/`-e` | `False` | Add emojis to section header titles. | Boolean | Optional |
| `--image`/`-i` | `default` | Project logo image displayed in README header. | String | Optional |
| `--max-tokens` | `3899` | Max number of tokens that can be generated. | Integer | Optional |
| `--model`/`-m` | `gpt-3.5-turbo` | Select GPT model for content generation. | String | Optional |
| `--offline` | `False` | Generate a README without an API key. | Boolean | Optional |
| `--output`/`-o` | `readme-ai.md` | README output file name. | Path/String | Optional |
| `--repository`/`-r` | `None` | Repository URL or local path. | URL/String | Required |
| `--temperature`/`-t` | `0.8` | LLM API creativity level. | Float | Optional |
| `--template` | `None` | Choose README template. | String | WIP |
| `--language`/`-l` | `English (en)` | Language for content. | String | WIP |
WIP = work in progress, or feature currently under development.
For more details about each option, run `readmeai --help` in your terminal.
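For example, a hypothetical invocation combining several of the options above (flag values are illustrative; the guard skips execution if readmeai is not installed):

```shell
# Generate a README with custom badges, emojis, and a lower temperature
if command -v readmeai >/dev/null 2>&1; then
  readmeai --repository https://github.com/eli64s/readme-ai \
    --output readme-ai.md \
    --badges flat-square \
    --emojis \
    --temperature 0.5
else
  echo "readmeai not installed; skipping"
fi
```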
Badge Icons
Select your preferred badge icon style to display in your output file using the `--badges` flag. The default badge style displays basic metadata about your repository using shields.io badges. If you select another option, the `default` badges will be automatically included.
Options: `default`, `flat`, `flat-square`, `for-the-badge`, `plastic`, `skills`, `skills-light`, `social`.
Project Logo
Select an image to display in your README header section using the `--image` flag.
Available images: Default, Black, Grey, Purple, Yellow.
To provide your own image, use the CLI option `--image custom`, and you will be prompted to enter a URL to your image.
The readme-ai tool is designed with flexibility in mind, allowing users to configure various aspects of its operation through a series of models and settings. The configuration file covers aspects such as language model settings, git host providers, repository details, markdown templates, and more.
🔠 Configuration Models
GitService Enum
- Purpose: Defines Git service details.
- Attributes:
  - `LOCAL`, `GITHUB`, `GITLAB`, `BITBUCKET`: Enumerations for different Git services.
  - `host`: Service host URL.
  - `api_url`: API base URL for the service.
  - `file_url`: URL format for accessing files in the repository.
BadgeOptions Enum
- Purpose: Provides options for README file badge icons.
- Options: `FLAT`, `FLAT_SQUARE`, `FOR_THE_BADGE`, `PLASTIC`, `SKILLS`, `SKILLS_LIGHT`, `SOCIAL`.
ImageOptions Enum
- Purpose: Lists CLI options for README file header images.
- Options: `CUSTOM`, `BLACK`, `BLUE`, `GRADIENT`, `PURPLE`, `YELLOW`.
CliSettings
- Purpose: Defines CLI options for the application.
- Fields:
  - `emojis`: Enables or disables emoji usage.
  - `offline`: Indicates offline mode operation.
FileSettings
- Purpose: Configuration for various file paths used in the application.
- Fields: `dependency_files`, `identifiers`, `ignore_files`, `language_names`, `language_setup`, `output`, `shields_icons`, `skill_icons`.
GitSettings
- Purpose: Manages repository settings and validations.
- Fields:
  - `repository`: The repository URL or path.
  - `source`: The source of the Git repository.
  - `name`: The name of the repository.
LlmApiSettings
- Purpose: Holds settings for OpenAI's LLM API.
- Fields: `content`, `endpoint`, `encoding`, `model`, `rate_limit`, `temperature`, `tokens`, `tokens_max`.
MarkdownSettings
- Purpose: Contains Markdown templates for different sections of a README.
- Fields: Templates for aligning text, badges, headers, images, features, getting started, overview, tables of contents, etc.
PromptSettings
- Purpose: Configures prompts for OpenAI's LLM API.
- Fields: `features`, `overview`, `slogan`, `summaries`.
AppConfig
- Purpose: Nested model encapsulating all application configurations.
- Fields: `cli`, `files`, `git`, `llm`, `md`, `prompts`.
AppConfigModel
- Purpose: Pydantic model for the entire application configuration.
- Sub-models: `AppConfig`.
ConfigHelper
- Purpose: Assists in loading additional configuration files.
- Methods: `load_helper_files` loads configuration from additional files.
Functions
_get_config_dict
- Purpose: Retrieves configuration data from TOML files.
- Parameters:
  - `handler`: Instance of `FileHandler`.
  - `file_path`: Path to the configuration file.
load_config
- Purpose: Loads the main configuration file.
- Parameters:
  - `path`: Path to the configuration file.
- Returns: An instance of `AppConfig`.
load_config_helper
- Purpose: Loads multiple configuration helper files.
- Parameters:
  - `conf`: An instance of `AppConfigModel`.
- Returns: An instance of `ConfigHelper`.
Usage
The configurations are loaded using the `load_config` function, which parses a TOML file into the `AppConfigModel`. This model is then used throughout the application to access various settings. Additional helper files can be loaded using `ConfigHelper`, which further enriches the application's configuration context.
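To make the shape concrete, below is a minimal sketch of a TOML excerpt matching these models. Field names are taken from the settings described above; the section layout and values are illustrative assumptions, and the real file shipped with readme-ai may differ.

```shell
# Write a hypothetical excerpt mirroring LlmApiSettings and GitSettings fields
cat > /tmp/readmeai_config_example.toml <<'EOF'
[llm]
model = "gpt-3.5-turbo"
temperature = 0.8
tokens_max = 3899

[git]
repository = "https://github.com/eli64s/readme-ai"
EOF
grep "model" /tmp/readmeai_config_example.toml
```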
- Publish readme-ai CLI as a Python package on PyPI.
- Containerize the readme-ai CLI as a Docker image via Docker Hub.
- Serve the readme-ai CLI as a web app, deployed on Streamlit Community Cloud.
- Integrate singular interface for all LLM API providers (Anthropic, Cohere, Gemini, etc.)
- Design template system to give users a variety of README document flavors (ai, data, web, etc.)
- Develop a robust documentation generation process extendable to full project docs (e.g., Sphinx, MkDocs).
- Add support for generating README files in any language (e.g., CN, ES, FR, JA, KO, RU).
- Create GitHub Actions script to automatically update README file content on repository push.
Badges