Tabby is a self-hosted AI coding assistant, offering an open-source and on-premises alternative to GitHub Copilot. It boasts several key features:
- Self-contained, with no need for a DBMS or cloud service.
- OpenAPI interface, easy to integrate with existing infrastructure (e.g., Cloud IDE); see the example request after this list.
- Supports consumer-grade GPUs.
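Because the server speaks plain HTTP, any tool that can send a request can integrate with it. Below is a minimal sketch of calling the completion endpoint, assuming a server on localhost:8080 and the `/v1/completions` path; field names can differ across versions, so verify against the Swagger UI served by your own instance.

```bash
# Sketch: request a code completion over the OpenAPI interface.
# Endpoint path and payload fields are assumed from recent versions;
# check the Swagger UI exposed by your server if they differ.
curl -X POST http://localhost:8080/v1/completions \
  -H "Content-Type: application/json" \
  -d '{
    "language": "python",
    "segments": {
      "prefix": "def fib(n):\n    ",
      "suffix": "\n"
    }
  }'
```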
🔥 What's New
05/25/2025 💡Interested in joining the Agent private preview? DM us on X for early waitlist approval!🎫
05/20/2025 Enhance Tabby with your own documentation📃 through REST APIs in v0.29! 🎉
05/01/2025 v0.28 transforms Answer Engine messages into persistent, shareable Pages.
03/31/2025 v0.27 released with a richer @ menu in the chat side panel.
02/05/2025 LDAP Authentication and better notifications for background jobs are coming in Tabby v0.24.0!✨
02/04/2025 VSCode 1.20.0 upgrade! @-mentioning files to add them as chat context and inline editing via a new right-click option are now available!
Archived
01/10/2025 Tabby v0.23.0, featuring an enhanced code browser experience and chat side panel improvements!
12/24/2024 Introducing the Notification Box in Tabby v0.22.0!
12/06/2024 Llamafile deployment integration and an enhanced Answer Engine user experience are coming in Tabby v0.21.0!🚀
11/10/2024 Switching between different backend chat models is supported in the Answer Engine with Tabby v0.20.0!
10/30/2024 Tabby v0.19.0 featuring recent shared threads on the main page to improve their discoverability.
07/09/2024 🎉Announcing Codestral integration in Tabby!
07/05/2024 Tabby v0.13.0 introduces the Answer Engine, a central knowledge engine for internal engineering teams. It seamlessly integrates with your dev team's internal data, delivering reliable and precise answers to empower developers.
06/13/2024 VSCode 1.7 marks a significant milestone with a versatile Chat experience throughout your coding workflow. Come and try the latest chat in the side panel and editing via chat command!
06/10/2024 Latest 📃blog post drops on enhanced code context understanding in Tabby!
06/06/2024 The Tabby v0.12.0 release brings 🔗seamless integrations (GitLab SSO, self-hosted GitHub/GitLab, etc.), ⚙️flexible configurations (HTTP API integration), and 🌐expanded capabilities (repo-context in Code Browser)!
05/22/2024 Tabby VSCode 1.6 comes with multiple choices in inline completion and auto-generated commit messages 🐱💻!
05/11/2024 v0.11.0 brings significant enterprise upgrades, including 📊storage usage stats, 🔗GitHub & GitLab integration, 📋Activities page, and the long-awaited 🤖Ask Tabby feature!
04/22/2024 v0.10.0 released, featuring the latest Reports tab with team-wise analytics for Tabby usage.
04/19/2024 📣 Tabby now incorporates locally relevant snippets (declarations from local LSP, and recently modified code) for code completion!
04/17/2024 The CodeGemma and CodeQwen model series have now been added to the official registry!
03/20/2024 v0.9 released, highlighting a full-featured admin UI.
12/23/2023 Seamlessly deploy Tabby on any cloud with SkyServe 🛫 from SkyPilot.
12/15/2023 v0.7.0 released with team management and secured access!
10/15/2023 RAG-based code completion is enabled by default in v0.3.0 🎉! Check out the blog post explaining how Tabby utilizes repo-level context to get even smarter!
11/27/2023 v0.6.0 released!
11/09/2023 v0.5.5 released! With a UI redesign and performance improvements.
10/24/2023 ⛳️ Major updates for Tabby IDE plugins across VSCode/Vim/IntelliJ!
10/04/2023 Check out the model directory for the latest models supported by Tabby.
09/18/2023 Apple's M1/M2 Metal inference support has landed in v0.1.1!
08/31/2023 Tabby's first stable release v0.0.1 🥳.
08/28/2023 Experimental support for CodeLlama 7B.
08/24/2023 Tabby is now on the JetBrains Marketplace!
👋 Getting Started
You can find our documentation here.
Run Tabby in 1 Minute
The easiest way to start a Tabby server is by using the following Docker command:
```bash
docker run -it \
  --gpus all -p 8080:8080 -v $HOME/.tabby:/data \
  tabbyml/tabby \
  serve --model StarCoder-1B --device cuda --chat-model Qwen2-1.5B-Instruct
```
For additional options (e.g., inference type, parallelism), please refer to the documentation page.
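Once the container is up, a quick sanity check is to hit the health endpoint. This is a sketch assuming the default port mapping above and the `/v1/health` path; the exact route may vary by version.

```bash
# Sketch: confirm the server is healthy and inspect the loaded models/device.
# Assumes the default 8080 port mapping from the Docker command above.
curl http://localhost:8080/v1/health
```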
🤝 Contributing
Full guide at CONTRIBUTING.md.
Get the Code
```bash
git clone --recurse-submodules https://github.com/TabbyML/tabby
cd tabby
```
If you have already cloned the repository, you can run `git submodule update --recursive --init` to fetch all submodules.
Build
Set up the Rust environment by following this tutorial.
Install the required dependencies:
```bash
# macOS
brew install protobuf

# Ubuntu / Debian
apt install protobuf-compiler libopenblas-dev
```
Install useful tools:
```bash
apt install make sqlite3 graphviz
```
Now, you can build Tabby by running the command `cargo build`.
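If you would rather run the server straight from your build, here is a minimal sketch that reuses the model and device flags from the Docker example above; the binary path assumes a default cargo release build, and the flags are taken to match the published `tabby` CLI.

```bash
# Sketch: build a release binary and start a server from source.
# The binary path assumes a default `cargo build --release` layout;
# model/device flags mirror the Docker example above.
cargo build --release
./target/release/tabby serve --model StarCoder-1B --device cuda --chat-model Qwen2-1.5B-Instruct
```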
Start Hacking!
... and don't forget to submit a Pull Request
🌍 Community
🎤 Twitter / X - engage with TabbyML for all things possible
📚 LinkedIn - follow for the latest from the community
💌 Newsletter - subscribe to unlock Tabby insights and secrets
🔆 Activity
🌟 Star History