From cae28866917525890ccba68dc32b7fea5a0922c1 Mon Sep 17 00:00:00 2001
From: Xiao-Ran Zhou
Date: Fri, 22 Dec 2023 10:53:05 +0100
Subject: [PATCH 1/5] Update wasm.md

---
 docs/wasm.md | 10 +++++++++-
 1 file changed, 9 insertions(+), 1 deletion(-)

diff --git a/docs/wasm.md b/docs/wasm.md
index 7a2f38ae..ab6bb7a4 100644
--- a/docs/wasm.md
+++ b/docs/wasm.md
@@ -1,3 +1,11 @@
 # LLM in your Browser - WebAssembly
 
-Coming soon.
\ No newline at end of file
+Thanks to the new WebAssembly (WASM) technology and the WebGPU support in Chrome, it is now possible to run an LLM locally in the browser.
+The newest deployment of biochatter (chatGSE-next) also offers a WASM option. The following steps are needed to run the LLM in your local Docker setup.
+1. `git clone https://github.com/xiaoranzhou/chatgse-next` (change to the biocypher repo after merge)
+2. `git lfs install`
+`git clone https://huggingface.co/zxrzxr/zephyr-7b-beta-chatRDM-q4f32_1/chatgse-next/chatgse/public/mistral`
+3. `docker-compose -f chatgse/docker-compose.yml up -d`
+4. Open http://localhost:5000/#/webllm in **CHROME** (very important; other browsers do not support WebGPU yet)
+5. Wait for the LLM model to load, around 3-5 minutes.
+6. Write your question in the chat input and click the send button next to the normal send button.

From 7483af2134b9f7c48e5b97a3d470b56c3a2b23b9 Mon Sep 17 00:00:00 2001
From: Xiao-Ran Zhou
Date: Fri, 22 Dec 2023 10:59:16 +0100
Subject: [PATCH 2/5] Update wasm.md

---
 docs/wasm.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/wasm.md b/docs/wasm.md
index ab6bb7a4..a8f9820d 100644
--- a/docs/wasm.md
+++ b/docs/wasm.md
@@ -6,6 +6,6 @@ The newest deployment of biochatter (chatGSE-next) also offers a WASM option.
 2. `git lfs install`
 `git clone https://huggingface.co/zxrzxr/zephyr-7b-beta-chatRDM-q4f32_1/chatgse-next/chatgse/public/mistral`
 3. `docker-compose -f chatgse/docker-compose.yml up -d`
-4. Open http://localhost:5000/#/webllm in **CHROME** (very important; other browsers do not support WebGPU yet)
+4. Open http://localhost:3000/#/webllm in **CHROME** (very important; other browsers do not support WebGPU yet)
 5. Wait for the LLM model to load, around 3-5 minutes.
 6. Write your question in the chat input and click the send button next to the normal send button.

From effe786c4247e9228a73329c7938a9201dfdc56e Mon Sep 17 00:00:00 2001
From: Xiao-Ran Zhou
Date: Fri, 22 Dec 2023 11:01:58 +0100
Subject: [PATCH 3/5] Update wasm.md

---
 docs/wasm.md | 6 ++++--
 1 file changed, 4 insertions(+), 2 deletions(-)

diff --git a/docs/wasm.md b/docs/wasm.md
index a8f9820d..89d19f2c 100644
--- a/docs/wasm.md
+++ b/docs/wasm.md
@@ -7,5 +7,7 @@ The newest deployment of biochatter (chatGSE-next) also offers a WASM option.
 `git clone https://huggingface.co/zxrzxr/zephyr-7b-beta-chatRDM-q4f32_1/chatgse-next/chatgse/public/mistral`
 3. `docker-compose -f chatgse/docker-compose.yml up -d`
 4. Open http://localhost:3000/#/webllm in **CHROME** (very important; other browsers do not support WebGPU yet)
-5. Wait for the LLM model to load, around 3-5 minutes.
-6. Write your question in the chat input and click the send button next to the normal send button.
+5. Wait for the LLM model to load, around 3-5 minutes. You might need to refresh the webpage until you see the "mistral" text (red circle):
+ ![image](https://github.com/xiaoranzhou/biochatter/assets/29843510/684c735c-5d92-4cbe-9825-eb9eeec43bef)
+
+7. Write your question in the chat input and click the send button next to the normal send button (blue circle).

From a116cfa3576555003c76de135f1ed8fda5848bf8 Mon Sep 17 00:00:00 2001
From: Xiao-Ran Zhou
Date: Tue, 2 Jan 2024 18:26:02 +0100
Subject: [PATCH 4/5] Update wasm.md

---
 docs/wasm.md | 9 +++++----
 1 file changed, 5 insertions(+), 4 deletions(-)

diff --git a/docs/wasm.md b/docs/wasm.md
index 89d19f2c..b6fa3367 100644
--- a/docs/wasm.md
+++ b/docs/wasm.md
@@ -4,10 +4,11 @@ Thanks to the new WebAssembly (WASM) technology and the WebGPU support in Chrome
 The newest deployment of biochatter (chatGSE-next) also offers a WASM option. The following steps are needed to run the LLM in your local Docker setup.
 1. `git clone https://github.com/xiaoranzhou/chatgse-next` (change to the biocypher repo after merge)
 2. `git lfs install`
-`git clone https://huggingface.co/zxrzxr/zephyr-7b-beta-chatRDM-q4f32_1/chatgse-next/chatgse/public/mistral`
-3. `docker-compose -f chatgse/docker-compose.yml up -d`
-4. Open http://localhost:3000/#/webllm in **CHROME** (very important; other browsers do not support WebGPU yet)
-5. Wait for the LLM model to load, around 3-5 minutes. You might need to refresh the webpage until you see the "mistral" text (red circle):
+`git clone https://huggingface.co/zxrzxr/zephyr-7b-beta-chatRDM-q4f32_1/ chatgse-next/chatgse/public/mistral`
+3. `cd chatgse-next`
+4. `docker-compose -f chatgse/docker-compose.yml up -d`
+5. Open http://localhost:3000/#/webllm in **CHROME** (very important; other browsers do not support WebGPU yet)
+6. Wait for the LLM model to load, around 3-5 minutes. You might need to refresh the webpage until you see the "mistral" text (red circle):
  ![image](https://github.com/xiaoranzhou/biochatter/assets/29843510/684c735c-5d92-4cbe-9825-eb9eeec43bef)
 
 7. Write your question in the chat input and click the send button next to the normal send button (blue circle).

From 6979ffcb2720dd6f65963886932e290b6b2cebdf Mon Sep 17 00:00:00 2001
From: Xiao-Ran Zhou
Date: Tue, 2 Jan 2024 18:36:17 +0100
Subject: [PATCH 5/5] Update wasm.md

---
 docs/wasm.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/wasm.md b/docs/wasm.md
index b6fa3367..9c5b7060 100644
--- a/docs/wasm.md
+++ b/docs/wasm.md
@@ -4,7 +4,7 @@ Thanks to the new WebAssembly (WASM) technology and the WebGPU support in Chrome
 The newest deployment of biochatter (chatGSE-next) also offers a WASM option. The following steps are needed to run the LLM in your local Docker setup.
 1. `git clone https://github.com/xiaoranzhou/chatgse-next` (change to the biocypher repo after merge)
 2. `git lfs install`
-`git clone https://huggingface.co/zxrzxr/zephyr-7b-beta-chatRDM-q4f32_1/ chatgse-next/chatgse/public/mistral`
+`git clone https://huggingface.co/zxrzxr/Mistral-7B-Instruct-v0.1-q4f32_1 chatgse-next/chatgse/public/mistral`
 3. `cd chatgse-next`
 4. `docker-compose -f chatgse/docker-compose.yml up -d`
 5. Open http://localhost:3000/#/webllm in **CHROME** (very important; other browsers do not support WebGPU yet)
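
For convenience, the numbered steps as they stand after the final patch (5/5) can be condensed into one setup script. This is a sketch, not part of the patched docs: it assumes a Unix-like shell with `git`, `git-lfs`, and `docker-compose` available, and reuses the repository URL, model URL, and compose-file path exactly as given in the patches.

```shell
#!/usr/bin/env sh
set -e  # stop on the first failing command

# Step 1: clone the app (the docs note this should point at the biocypher repo after merge).
git clone https://github.com/xiaoranzhou/chatgse-next

# Step 2: fetch the WebLLM model weights via Git LFS into the app's public folder.
git lfs install
git clone https://huggingface.co/zxrzxr/Mistral-7B-Instruct-v0.1-q4f32_1 \
    chatgse-next/chatgse/public/mistral

# Steps 3-4: start the stack from inside the repository.
cd chatgse-next
docker-compose -f chatgse/docker-compose.yml up -d

# Steps 5-6 are manual: open the WebLLM page in Chrome (WebGPU required)
# and allow ~3-5 minutes for the model to load.
echo "Open http://localhost:3000/#/webllm in Chrome"
```

On newer Docker installations the Compose v2 form `docker compose -f chatgse/docker-compose.yml up -d` may be needed instead of the hyphenated `docker-compose`.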