diff --git a/CHANGELOG.md b/CHANGELOG.md
index 4e2d3d3..5196add 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -1,9 +1,33 @@
 # Changelog

-_Current version: 0.0.33_
+_Current version: 0.0.34_

 [PyPi link](https://pypi.org/project/l2m2/)

+### 0.0.34 - September 30, 2024
+
+> [!CAUTION]
+> This release has breaking changes! Please read the changelog carefully.
+
+#### Added
+
+- New supported models `gemma-2-9b`, `llama-3.2-1b`, and `llama-3.2-3b` via Groq.
+
+#### Changed
+
+- In order to be more consistent with l2m2's naming scheme, the following model ids have been updated:
+  - `llama3-8b` → `llama-3-8b`
+  - `llama3-70b` → `llama-3-70b`
+  - `llama3.1-8b` → `llama-3.1-8b`
+  - `llama3.1-70b` → `llama-3.1-70b`
+  - `llama3.1-405b` → `llama-3.1-405b`
+- **This is a breaking change!!!** Calls using the old `model_id`s (`llama3-8b`, etc.) will fail.
+
+#### Removed
+
+- Provider `octoai` has been removed as they have [been acquired](https://www.geekwire.com/2024/chip-giant-nvidia-acquires-octoai-a-seattle-startup-that-helps-companies-run-ai-models/) and are shutting down their cloud platform. **This is a breaking change!!!** Calls using the `octoai` provider will fail.
+  - All previous OctoAI supported models (`mixtral-8x22b`, `mixtral-8x7b`, `mistral-7b`, `llama-3-70b`, `llama-3.1-8b`, `llama-3.1-70b`, and `llama-3.1-405b`) are still available via Mistral, Groq, and/or Replicate.
+
 ### 0.0.33 - September 11, 2024

 #### Changed
diff --git a/README.md b/README.md
index 166731e..a04cb03 100644
--- a/README.md
+++ b/README.md
@@ -1,12 +1,12 @@
 # L2M2: A Simple Python LLM Manager 💬👍

-[![Tests](https://github.com/pkelaita/l2m2/actions/workflows/tests.yml/badge.svg?timestamp=1726092692)](https://github.com/pkelaita/l2m2/actions/workflows/tests.yml) [![codecov](https://codecov.io/github/pkelaita/l2m2/graph/badge.svg?token=UWIB0L9PR8)](https://codecov.io/github/pkelaita/l2m2) [![PyPI version](https://badge.fury.io/py/l2m2.svg?timestamp=1726092692)](https://badge.fury.io/py/l2m2)
+[![Tests](https://github.com/pkelaita/l2m2/actions/workflows/tests.yml/badge.svg?timestamp=1727732688)](https://github.com/pkelaita/l2m2/actions/workflows/tests.yml) [![codecov](https://codecov.io/github/pkelaita/l2m2/graph/badge.svg?token=UWIB0L9PR8)](https://codecov.io/github/pkelaita/l2m2) [![PyPI version](https://badge.fury.io/py/l2m2.svg?timestamp=1727732688)](https://badge.fury.io/py/l2m2)

 **L2M2** ("LLM Manager" → "LLMM" → "L2M2") is a tiny and very simple LLM manager for Python that exposes lots of models through a unified API. This is useful for evaluation, demos, production applications etc. that need to easily be model-agnostic.

 ### Features

-- 22 supported models (see below) – regularly updated and with more on the way.
+- 25 supported models (see below) – regularly updated and with more on the way.
 - Session chat memory – even across multiple models or with concurrent memory streams.
 - JSON mode
 - Prompt loading tools
@@ -23,30 +23,33 @@ L2M2 currently supports the following models:
-| Model Name          | Provider(s)                                                                                         | Model Version(s)                                                                  |
-| ------------------- | --------------------------------------------------------------------------------------------------- | --------------------------------------------------------------------------------- |
-| `gpt-4o`            | [OpenAI](https://openai.com/product)                                                                | `gpt-4o-2024-08-06`                                                               |
-| `gpt-4o-mini`       | [OpenAI](https://openai.com/product)                                                                | `gpt-4o-mini-2024-07-18`                                                          |
-| `gpt-4-turbo`       | [OpenAI](https://openai.com/product)                                                                | `gpt-4-turbo-2024-04-09`                                                          |
-| `gpt-3.5-turbo`     | [OpenAI](https://openai.com/product)                                                                | `gpt-3.5-turbo-0125`                                                              |
-| `gemini-1.5-pro`    | [Google](https://ai.google.dev/)                                                                    | `gemini-1.5-pro`                                                                  |
-| `gemini-1.0-pro`    | [Google](https://ai.google.dev/)                                                                    | `gemini-1.0-pro`                                                                  |
-| `claude-3.5-sonnet` | [Anthropic](https://www.anthropic.com/api)                                                          | `claude-3-5-sonnet-20240620`                                                      |
-| `claude-3-opus`     | [Anthropic](https://www.anthropic.com/api)                                                          | `claude-3-opus-20240229`                                                          |
-| `claude-3-sonnet`   | [Anthropic](https://www.anthropic.com/api)                                                          | `claude-3-sonnet-20240229`                                                        |
-| `claude-3-haiku`    | [Anthropic](https://www.anthropic.com/api)                                                          | `claude-3-haiku-20240307`                                                         |
-| `command-r`         | [Cohere](https://docs.cohere.com/)                                                                  | `command-r`                                                                       |
-| `command-r-plus`    | [Cohere](https://docs.cohere.com/)                                                                  | `command-r-plus`                                                                  |
-| `mistral-large-2`   | [Mistral](https://mistral.ai/)                                                                      | `mistral-large-latest`                                                            |
-| `mixtral-8x22b`     | [Mistral](https://mistral.ai/), [OctoAI](https://octoai.cloud/)                                     | `open-mixtral-8x22b`, `mixtral-8x22b-instruct`                                    |
-| `mixtral-8x7b`      | [Mistral](https://mistral.ai/), [OctoAI](https://octoai.cloud/), [Groq](https://wow.groq.com/)      | `open-mixtral-8x7b`, `mixtral-8x7b-instruct`, `mixtral-8x7b-32768`                |
-| `mistral-7b`        | [Mistral](https://mistral.ai/), [OctoAI](https://octoai.cloud/)                                     | `open-mistral-7b`, `mistral-7b-instruct`                                          |
-| `gemma-7b`          | [Groq](https://wow.groq.com/)                                                                       | `gemma-7b-it` ,                                                                   |
-| `llama-3-8b`        | [Groq](https://wow.groq.com/), [Replicate](https://replicate.com/)                                  | `llama-3-8b-8192`, `meta/meta-llama-3-8b-instruct`                                |
-| `llama-3-70b`       | [Groq](https://wow.groq.com/), [Replicate](https://replicate.com/), [OctoAI](https://octoai.cloud/) | `llama-3-70b-8192`, `meta/meta-llama-3-70b-instruct`, `meta-llama-3-70b-instruct` |
-| `llama-3.1-8b`      | [OctoAI](https://octoai.cloud/)                                                                     | `meta-llama-3.1-8b-instruct`                                                      |
-| `llama-3.1-70b`     | [OctoAI](https://octoai.cloud/)                                                                     | `meta-llama-3.1-70b-instruct`                                                     |
-| `llama-3.1-405b`    | [Replicate](https://replicate.com/), [OctoAI](https://octoai.cloud/)                                | `meta/meta-llama-3.1-405b-instruct`, `meta-llama-3.1-405b-instruct`               |
+| Model Name          | Provider(s)                                                        | Model Version(s)                                    |
+| ------------------- | ------------------------------------------------------------------ | --------------------------------------------------- |
+| `gpt-4o`            | [OpenAI](https://openai.com/product)                               | `gpt-4o-2024-08-06`                                 |
+| `gpt-4o-mini`       | [OpenAI](https://openai.com/product)                               | `gpt-4o-mini-2024-07-18`                            |
+| `gpt-4-turbo`       | [OpenAI](https://openai.com/product)                               | `gpt-4-turbo-2024-04-09`                            |
+| `gpt-3.5-turbo`     | [OpenAI](https://openai.com/product)                               | `gpt-3.5-turbo-0125`                                |
+| `gemini-1.5-pro`    | [Google](https://ai.google.dev/)                                   | `gemini-1.5-pro`                                    |
+| `gemini-1.0-pro`    | [Google](https://ai.google.dev/)                                   | `gemini-1.0-pro`                                    |
+| `claude-3.5-sonnet` | [Anthropic](https://www.anthropic.com/api)                         | `claude-3-5-sonnet-20240620`                        |
+| `claude-3-opus`     | [Anthropic](https://www.anthropic.com/api)                         | `claude-3-opus-20240229`                            |
+| `claude-3-sonnet`   | [Anthropic](https://www.anthropic.com/api)                         | `claude-3-sonnet-20240229`                          |
+| `claude-3-haiku`    | [Anthropic](https://www.anthropic.com/api)                         | `claude-3-haiku-20240307`                           |
+| `command-r`         | [Cohere](https://docs.cohere.com/)                                 | `command-r`                                         |
+| `command-r-plus`    | [Cohere](https://docs.cohere.com/)                                 | `command-r-plus`                                    |
+| `mistral-large-2`   | [Mistral](https://mistral.ai/)                                     | `mistral-large-latest`                              |
+| `mixtral-8x22b`     | [Mistral](https://mistral.ai/)                                     | `open-mixtral-8x22b`                                |
+| `mixtral-8x7b`      | [Mistral](https://mistral.ai/), [Groq](https://wow.groq.com/)      | `open-mixtral-8x7b`, `mixtral-8x7b-32768`           |
+| `mistral-7b`        | [Mistral](https://mistral.ai/)                                     | `open-mistral-7b`                                   |
+| `gemma-7b`          | [Groq](https://wow.groq.com/)                                      | `gemma-7b-it`                                       |
+| `gemma-2-9b`        | [Groq](https://wow.groq.com/)                                      | `gemma2-9b-it`                                      |
+| `llama-3-8b`        | [Groq](https://wow.groq.com/), [Replicate](https://replicate.com/) | `llama3-8b-8192`, `meta/meta-llama-3-8b-instruct`   |
+| `llama-3-70b`       | [Groq](https://wow.groq.com/), [Replicate](https://replicate.com/) | `llama3-70b-8192`, `meta/meta-llama-3-70b-instruct` |
+| `llama-3.1-8b`      | [Groq](https://wow.groq.com/)                                      | `llama-3.1-8b-instant`                              |
+| `llama-3.1-70b`     | [Groq](https://wow.groq.com/)                                      | `llama-3.1-70b-versatile`                           |
+| `llama-3.1-405b`    | [Replicate](https://replicate.com/)                                | `meta/meta-llama-3.1-405b-instruct`                 |
+| `llama-3.2-1b`      | [Groq](https://wow.groq.com/)                                      | `llama-3.2-1b-preview`                              |
+| `llama-3.2-3b`      | [Groq](https://wow.groq.com/)                                      | `llama-3.2-3b-preview`                              |
@@ -101,7 +104,6 @@ To activate any of the providers, set the provider's API key in the correspondin
 | Google                  | `GOOGLE_API_KEY`      |
 | Groq                    | `GROQ_API_KEY`        |
 | Replicate               | `REPLICATE_API_TOKEN` |
-| OctoAI                  | `OCTOAI_TOKEN`        |
 | Mistral (La Plateforme) | `MISTRAL_API_KEY`     |

 Additionally, you can activate providers programmatically as follows:
diff --git a/l2m2/__init__.py b/l2m2/__init__.py
index 667c8b5..9621eef 100644
--- a/l2m2/__init__.py
+++ b/l2m2/__init__.py
@@ -1 +1 @@
-__version__ = "0.0.33"
+__version__ = "0.0.34"
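The model-id renames in the 0.0.34 changelog above can be applied to call sites mechanically. A minimal migration sketch — `RENAMED_MODEL_IDS` and `migrate_model_id` are hypothetical helpers for illustration, not part of l2m2's API:

```python
# Mapping of model ids renamed in l2m2 0.0.34 (old -> new), per the changelog.
RENAMED_MODEL_IDS = {
    "llama3-8b": "llama-3-8b",
    "llama3-70b": "llama-3-70b",
    "llama3.1-8b": "llama-3.1-8b",
    "llama3.1-70b": "llama-3.1-70b",
    "llama3.1-405b": "llama-3.1-405b",
}


def migrate_model_id(model_id: str) -> str:
    """Return the 0.0.34-compatible model id, leaving unrenamed ids untouched."""
    return RENAMED_MODEL_IDS.get(model_id, model_id)


print(migrate_model_id("llama3.1-405b"))  # -> llama-3.1-405b
```

Note that calls through the removed `octoai` provider have no mechanical rewrite: per the changelog, each affected model must be re-pointed at Mistral, Groq, or Replicate, with that provider's API key set.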