
🐾 Tabby


Tabby is a self-hosted AI coding assistant, offering an open-source and on-premises alternative to GitHub Copilot. It boasts several key features:

  • Self-contained, with no need for a DBMS or cloud service.
  • OpenAPI interface, easy to integrate with existing infrastructure (e.g., Cloud IDE).
  • Supports consumer-grade GPUs.


👋 Getting Started

The easiest way to start a Tabby server is by using the following Docker command:

docker run -it \
  --gpus all -p 8080:8080 -v $HOME/.tabby:/data \
  tabbyml/tabby \
  serve --model TabbyML/SantaCoder-1B --device cuda

For additional options (e.g., inference type, parallelism), please refer to the documentation at https://tabbyml.github.io/tabby.
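Once the server is up, completions can be requested over the OpenAPI interface. The sketch below builds a request body in Python; note that the endpoint path (/v1/completions) and field names (language, segments.prefix) are assumptions based on Tabby's published API docs, so verify them against the documentation linked above before relying on them.

```python
import json

# Hypothetical completion request body for a Tabby server
# (field names are assumptions; check the OpenAPI docs).
payload = {
    "language": "python",
    "segments": {
        # Code before the cursor that the model should continue.
        "prefix": "def fib(n):\n    ",
    },
}

body = json.dumps(payload)
print(body)

# With the Docker container from above running, the request could
# be sent with, for example:
#   curl -X POST http://localhost:8080/v1/completions \
#     -H 'Content-Type: application/json' \
#     -d "$BODY"
```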

🤝 Contributing

Get the Code

git clone --recurse-submodules https://github.com/TabbyML/tabby
cd tabby

Build

  1. Set up the Rust environment by following this tutorial.

  2. Install the required dependencies:

# For macOS
brew install protobuf

# For Ubuntu / Debian
apt-get install protobuf-compiler libopenblas-dev

  3. Now you can build Tabby by running cargo build.

Start Hacking!

... and don't forget to submit a Pull Request

🌟 Star History

Star History Chart
