diff --git a/docs/src/pages/integrations/coding/_assets/tabby-answer-engine.png b/docs/src/pages/integrations/coding/_assets/tabby-answer-engine.png
new file mode 100644
index 0000000000..420b609fcb
Binary files /dev/null and b/docs/src/pages/integrations/coding/_assets/tabby-answer-engine.png differ
diff --git a/docs/src/pages/integrations/coding/_assets/tabby-chat-sidebar.png b/docs/src/pages/integrations/coding/_assets/tabby-chat-sidebar.png
new file mode 100644
index 0000000000..50cfbd226b
Binary files /dev/null and b/docs/src/pages/integrations/coding/_assets/tabby-chat-sidebar.png differ
diff --git a/docs/src/pages/integrations/coding/tabby.mdx b/docs/src/pages/integrations/coding/tabby.mdx
new file mode 100644
index 0000000000..6647b4eb46
--- /dev/null
+++ b/docs/src/pages/integrations/coding/tabby.mdx
@@ -0,0 +1,104 @@
---
title: Tabby
description: A step-by-step guide on integrating Jan with Tabby and VSCode, JetBrains, or other IDEs.
keywords:
  [
    Jan,
    Customizable Intelligence,
    LLM,
    local AI,
    privacy focus,
    free and open source,
    private and offline,
    conversational AI,
    no-subscription fee,
    large language models,
    Tabby integration,
    VSCode integration,
    JetBrains integration,
  ]
---

import { Tabs, Steps } from 'nextra/components'

# Tabby

## Integrate Jan with Tabby and Your Favorite IDEs

[Tabby](https://www.tabbyml.com/) is an open-source, self-hosted AI coding assistant. With Tabby, teams can easily set up their own LLM-powered code completion server.

Tabby provides integrations with VSCode, JetBrains, and other IDEs to help developers code more efficiently, and it can be used with various LLM services, including Jan.

To integrate Jan with Tabby, follow these steps:

### Step 1: Enable the Jan API Server

To set up Tabby with Jan's local server, you must start the Jan API Server with your chosen model.

1. Click the `Local API Server` (`<>`) button above the Settings.
   Jan will direct you to the **Local API Server** section.
2. Configure the server, including the **IP Port**, **Cross-Origin Resource Sharing (CORS)**, and **Verbose Server Logs**.
3. Press the **Start Server** button.

### Step 2: Find the Model ID and Ensure the Model Is Activated

1. Go to `Settings` > `My Models`.
2. Models are listed with their **Model ID** beneath their names.
3. Click the **three dots (⋮)** button next to the model.
4. Select **Start Model** to activate the model.

### Step 3: Install the Tabby Server

Use the following documentation to install the Tabby server:
- [Docker](https://tabby.tabbyml.com/docs/quick-start/installation/docker/)
- [Apple Silicon](https://tabby.tabbyml.com/docs/quick-start/installation/apple/)
- [Linux](https://tabby.tabbyml.com/docs/quick-start/installation/linux/)
- [Windows](https://tabby.tabbyml.com/docs/quick-start/installation/windows/)

Then follow the steps to connect Jan with the Tabby server: [Connect Jan with Tabby](https://tabby.tabbyml.com/docs/references/models-http-api/jan.ai/).

For example, to connect Jan with Tabby, save the following configuration in `~/.tabby/config.toml`:

```toml title="~/.tabby/config.toml"
# Chat model
[model.chat.http]
kind = "openai/chat"
model_name = "model_id"
api_endpoint = "http://localhost:1337/v1"
api_key = ""
```

Currently, Jan's completion and embedding APIs are under construction. Once they are complete, you will also be able to connect Jan with Tabby for completion and embedding tasks.
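Tabby talks to Jan through Jan's OpenAI-compatible chat API, which is what the `kind = "openai/chat"` setting selects. As a rough sketch of what that compatibility means (the model ID and port below are placeholders that must match your own Jan setup, not fixed values), the request Tabby sends is an ordinary OpenAI-style chat completion call:

```python
import json
from urllib import request

# Placeholders: these must match the model_name and api_endpoint
# configured in ~/.tabby/config.toml.
API_ENDPOINT = "http://localhost:1337/v1"
MODEL_ID = "model_id"

def build_chat_request(prompt: str) -> request.Request:
    """Build an OpenAI-style chat completion request for Jan's local server."""
    payload = {
        "model": MODEL_ID,
        "messages": [{"role": "user", "content": prompt}],
    }
    return request.Request(
        f"{API_ENDPOINT}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("Explain this function.")
print(req.full_url)
```

Sending such a request with `urllib.request.urlopen(req)` while the Jan API Server is running is a quick way to confirm the endpoint works before pointing Tabby at it.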

### Step 4: Install Tabby in Your Favorite IDE

Refer to the following documentation to install the Tabby extension in your favorite IDE:
- [Visual Studio Code](https://tabby.tabbyml.com/docs/extensions/installation/vscode/)
- [JetBrains IntelliJ Platform](https://tabby.tabbyml.com/docs/extensions/installation/intellij/)
- [Vim / NeoVim](https://tabby.tabbyml.com/docs/extensions/installation/vim/)

## How to Use the Tabby Integration with Jan

### Answer Engine: Chat with Your Code and Documentation

Tabby offers an [Answer Engine](https://tabby.tabbyml.com/docs/administration/answer-engine/) on its homepage, which can use the Jan LLM together with related context, such as code, documentation, and web pages, to answer your questions.

Simply open the Tabby homepage at [localhost:8080](http://localhost:8080) and ask your questions.

![Answer Engine](./_assets/tabby-answer-engine.png)

### IDE Chat Sidebar

After installing the Tabby extension in your preferred IDE, you can chat with Jan to:

1. Discuss your code, receive suggestions, and ask for assistance.
2. Ask Jan to edit your code inline, then review and accept the proposed changes.

![Chat Sidebar](./_assets/tabby-chat-sidebar.png)
\ No newline at end of file