diff --git a/docs/learn/llm_app_architectures.mdx b/docs/learn/llm_app_architectures.mdx
index 2d7f75316b..f67b42ca76 100644
--- a/docs/learn/llm_app_architectures.mdx
+++ b/docs/learn/llm_app_architectures.mdx
@@ -10,7 +10,7 @@ There are multitude of architectures or pipelines for LLM applications. We discu
This architecture is the simplest. The LLM application is a simple wrapper around one prompt / LLM call.
-In agenta you can [create such LLM apps from the UI](/quickstart/getting-started-ui). You can use your own code in case that your model is not supported (or you would like to add some custom logic for pre-processing or post-processing the inputs).
+In agenta you can [create such LLM apps from the UI](/quickstart/getting-started-ui). You can also [use your own code](/quickstart/getting-started-code) if your [model is not supported](/tutorials/deploy-mistral-model) or if you would like to add custom logic for pre-processing or post-processing the inputs.