Merge pull request #269 from MichaelClifford/readme
update README
rhatdan authored Apr 15, 2024
2 parents: 6413cc1 + 01e9a59 · commit 21abf00
Showing 1 changed file with 18 additions and 20 deletions.
README.md: 38 changes (18 additions, 20 deletions)
@@ -1,35 +1,38 @@
# AI Lab Recipes

This repo contains recipes for building and running containerized AI and LLM
-Applications locally with Podman.
+Applications with Podman.

These containerized AI recipes can be used to help developers quickly prototype
-new AI and LLM based applications, without the need for relying on any other
+new AI and LLM based applications locally, without the need for relying on any other
externally hosted services. Since they are already containerized, it also helps
developers move quickly from prototype to production.

-## Model services
-
-[model servers examples](./model_servers)
+## Model servers

#### What's a model server?

-A model server is a program that serves machine-learning models or LLMs and
-makes their functions available via API so that applications can incorporate
-AI. This repository provides descriptions and files for building several model
-servers.
+A model server is a program that serves machine-learning models, such as LLMs, and
+makes their functions available via an API. This makes it easy for developers to
+incorporate AI into their applications. This repository provides descriptions and
+code for building several of these model servers.

Many of the sample applications rely on the `llamacpp_python` model server by
-default. This server can be used for various applications with various models.
+default. This server can be used for various generative AI applications with various models.
However, each sample application can be paired with a variety of model servers.

Learn how to build and run the llamacpp_python model server by following the
[llamacpp_python model server README.](/model_servers/llamacpp_python/README.md).

## Current Recipes

-There are several sample applications in this repository. They live in the
-[recipes](./recipes) folder.
+Recipes consist of at least two components: a model server and an AI application.
+The model server manages the model, and the AI application provides the specific
+logic needed to perform some specific task such as chat, summarization, object
+detection, etc.
+
+There are several sample applications in this repository that can be found in the
+[recipes](./recipes) directory.

They fall under the categories:

@@ -39,15 +42,10 @@ They fall under the categories:
* [natural language processing](./recipes/natural_language_processing)


-Most of the sample applications follow a similar pattern that includes a
-model-server and an inference application. Many sample applications utilize the
-[Streamlit UI](https://docs.streamlit.io/).
-
-Learn how to build and run each application by visiting each of the categories
-above. For example
-the [chatbot recipe](./recipes/natural_language_processing/chatbot).
+Learn how to build and run each application by visiting their READMEs.
+For example, learn how to run the [chatbot recipe here](./recipes/natural_language_processing/chatbot).

-## Current Locallm Images built from this repository
+## Current AI Lab Recipe images built from this repository

Images for many sample applications and models are available in `quay.io`. All
currently built images are tracked in
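
The updated README describes a pattern of an AI application layered on top of a model server. Below is a minimal sketch of that pattern, assuming the llamacpp_python model server is running locally and exposes an OpenAI-compatible chat-completions endpoint; the URL, port, and payload fields are illustrative assumptions, not values taken from the repository.

```python
# Illustrative sketch only: a tiny "AI application" layer calling a locally
# running model server (for example, llamacpp_python).
# The endpoint URL, port, and response shape are assumptions based on an
# OpenAI-compatible API; check the model server's README for the actual values.
import requests

MODEL_SERVER_URL = "http://localhost:8001/v1/chat/completions"  # assumed endpoint


def chat(prompt: str) -> str:
    """Send one chat turn to the model server and return the reply text."""
    response = requests.post(
        MODEL_SERVER_URL,
        json={
            "messages": [{"role": "user", "content": prompt}],
            "max_tokens": 128,
        },
        timeout=120,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]


if __name__ == "__main__":
    print(chat("Summarize what a model server does in one sentence."))
```

In a full recipe, the AI application would wrap logic like this in a UI such as Streamlit, while the model server handles loading and serving the model.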