diff --git a/docs/images/tutorial-deploy-mistral-model/deploy_dev_mistral.png b/docs/images/tutorial-deploy-mistral-model/deploy_dev_mistral.png
new file mode 100644
index 0000000000..0ebe1713c2
Binary files /dev/null and b/docs/images/tutorial-deploy-mistral-model/deploy_dev_mistral.png differ
diff --git a/docs/images/tutorial-deploy-mistral-model/endpoint_mistral.png b/docs/images/tutorial-deploy-mistral-model/endpoint_mistral.png
new file mode 100644
index 0000000000..87d36a0383
Binary files /dev/null and b/docs/images/tutorial-deploy-mistral-model/endpoint_mistral.png differ
diff --git a/docs/images/tutorial-deploy-mistral-model/evaluation_mistral.png b/docs/images/tutorial-deploy-mistral-model/evaluation_mistral.png
new file mode 100644
index 0000000000..f97235d72f
Binary files /dev/null and b/docs/images/tutorial-deploy-mistral-model/evaluation_mistral.png differ
diff --git a/docs/images/tutorial-deploy-mistral-model/playground_mistral.png b/docs/images/tutorial-deploy-mistral-model/playground_mistral.png
new file mode 100644
index 0000000000..ea8574b86c
Binary files /dev/null and b/docs/images/tutorial-deploy-mistral-model/playground_mistral.png differ
diff --git a/docs/mint.json b/docs/mint.json
index 7403a25fa5..424ec1f0ea 100644
--- a/docs/mint.json
+++ b/docs/mint.json
@@ -60,7 +60,8 @@
{
"group": "Tutorials",
"pages": [
- "tutorials/first-app-with-langchain"
+ "tutorials/first-app-with-langchain",
+ "tutorials/deploy-mistral-model"
]
},
{
diff --git a/docs/tutorials/deploy-mistral-model.mdx b/docs/tutorials/deploy-mistral-model.mdx
new file mode 100644
index 0000000000..7987ca7979
--- /dev/null
+++ b/docs/tutorials/deploy-mistral-model.mdx
@@ -0,0 +1,70 @@
+---
+title: Deploy Mistral-7B from Hugging Face
+---
+
+1. Create a [Hugging Face Account](https://huggingface.co/)
+
+2. Navigate to the [Mistral-7B-v0.1 model page](https://huggingface.co/mistralai/Mistral-7B-v0.1)
+
+3. Click on the 'Deploy' option, and then select 'Inference API'
+
+4. Create an Access Token (if you haven't already)
+
+5. Start with a simple Python script that queries the model through the Inference API:
+
+```python
+import requests
+
+API_URL = "https://api-inference.huggingface.co/models/mistralai/Mistral-7B-v0.1"
+headers = {"Authorization": "Bearer [Your_Token]"}
+
+def query(payload):
+ response = requests.post(API_URL, headers=headers, json=payload)
+ return response.json()
+
+output = query({"inputs": "Your query here"})
+print(output)
+```
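Hardcoding the token works for a quick test, but a safer pattern is to read it from an environment variable. A minimal sketch (the `HF_TOKEN` variable name and `build_headers` helper are illustrative assumptions, not part of any SDK):

```python
import os

def build_headers(token: str) -> dict:
    # Standard Bearer-token header expected by the Inference API
    return {"Authorization": f"Bearer {token}"}

# HF_TOKEN is an assumed variable name; export it in your shell before running
headers = build_headers(os.environ.get("HF_TOKEN", "[Your_Token]"))
```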
+6. Modify the script to instrument it with the Agenta SDK:
+
+```python
+import agenta as ag
+import requests
+
+API_URL = "https://api-inference.huggingface.co/models/mistralai/Mistral-7B-v0.1"
+headers = {"Authorization": "Bearer [Your_Token]"}
+
+ag.init()
+ag.config.default(
+ prompt_template=ag.TextParam("Summarize the following text: {text}"),
+)
+
+@ag.entrypoint
+def generate(text: str) -> str:
+ prompt = ag.config.prompt_template.format(text=text)
+ payload = {"inputs": prompt}
+ response = requests.post(API_URL, headers=headers, json=payload)
+ return response.json()[0]["generated_text"]
+```
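The `prompt_template` above is a plain Python format string, so the formatting step inside `generate` can be sketched standalone, with no Agenta or network access needed:

```python
# Mirrors what ag.config.prompt_template.format(text=...) produces
prompt_template = "Summarize the following text: {text}"
prompt = prompt_template.format(text="Mistral-7B is a 7-billion-parameter model.")
payload = {"inputs": prompt}
```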
+
+7. Run these commands to deploy the app to Agenta:
+```bash
+agenta init
+agenta variant serve app.py
+```
+8. Interact with your app in the playground, either locally at http://localhost or in the cloud at https://cloud.agenta.ai/apps
+
+![Playground with the Mistral model](/images/tutorial-deploy-mistral-model/playground_mistral.png)
+
+9. You can create a test set and evaluate the model's performance using Agenta's evaluation techniques
+
+![Evaluating the Mistral model in Agenta](/images/tutorial-deploy-mistral-model/evaluation_mistral.png)
+
+10. To deploy the variant, navigate to the playground, click 'Publish', and choose the environment to which you wish to deploy
+
+![Publishing the Mistral variant to an environment](/images/tutorial-deploy-mistral-model/deploy_dev_mistral.png)
+
+11. Go to the 'Endpoints' section, select the environment, then use the provided endpoint to send requests to the LLM app
+
+![Endpoints section for the Mistral app](/images/tutorial-deploy-mistral-model/endpoint_mistral.png)
+
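The deployed app can then be called like any HTTP API. A minimal sketch, assuming a hypothetical endpoint URL (copy the real one from the 'Endpoints' page; the `text` parameter name matches the entrypoint argument defined in `app.py`):

```python
# Hypothetical endpoint URL; replace with the one shown on the Endpoints page
ENDPOINT = "https://cloud.agenta.ai/api/your-app-slug/generate"

# Parameter names mirror the entrypoint signature: generate(text: str)
params = {"text": "Agenta is an open-source LLM developer platform."}

# With the real URL, send the request using the requests library:
# import requests
# print(requests.post(ENDPOINT, json=params).json())
```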
+