From e64e8b7a8942b905c91cfa428ed81dd59841b3ed Mon Sep 17 00:00:00 2001
From: Mahmoud Mabrouk
Date: Wed, 6 Dec 2023 18:58:32 +0100
Subject: [PATCH] Documentation edits made through Mintlify web editor

---
 docs/learn/llm_app_architectures.mdx | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/learn/llm_app_architectures.mdx b/docs/learn/llm_app_architectures.mdx
index 2d7f75316b..f67b42ca76 100644
--- a/docs/learn/llm_app_architectures.mdx
+++ b/docs/learn/llm_app_architectures.mdx
@@ -10,7 +10,7 @@ There are multitude of architectures or pipelines for LLM applications. We discu
 
 This architecture is the simplest. The LLM application is a simple wrapper around one prompt / LLM call.
 
-In agenta you can [create such LLM apps from the UI](/quickstart/getting-started-ui). You can use your own code in case that your model is not supported (or you would like to add some custom logic for pre-processing or post-processing the inputs).
+In agenta you can [create such LLM apps from the UI](/quickstart/getting-started-ui). Or you can [use your own code](quickstart/getting-started-code) in case that your [model is not supported](tutorials/deploy-mistral-model) (or you would like to add some custom logic for pre-processing or post-processing the inputs).