
Commit

[fix] model key typo, update version
pkelaita committed Aug 6, 2024
1 parent 64b32fd commit c28dd38
Showing 4 changed files with 9 additions and 7 deletions.
6 changes: 4 additions & 2 deletions CHANGELOG.md
@@ -1,17 +1,19 @@
# Changelog

-_Current version: 0.0.30_
+_Current version: 0.0.32_

[PyPI link](https://pypi.org/project/l2m2/)

-### 0.0.30 - August 5, 2024
+### 0.0.32 - August 5, 2024

#### Added

- [Mistral](https://mistral.ai/) provider support via La Plateforme.
- [Mistral Large 2](https://mistral.ai/news/mistral-large-2407/) model availability from Mistral.
- Mistral 7B, Mixtral 8x7B, and Mixtral 8x22B model availability from Mistral in addition to existing providers.

+- _0.0.30 and 0.0.31 are skipped due to a packaging error and a model key typo._

### 0.0.29 - August 4, 2024

> [!CAUTION]
6 changes: 3 additions & 3 deletions README.md
@@ -1,6 +1,6 @@
# L2M2: A Simple Python LLM Manager 💬👍

-[![Tests](https://github.com/pkelaita/l2m2/actions/workflows/tests.yml/badge.svg?timestamp=1722903983)](https://github.com/pkelaita/l2m2/actions/workflows/tests.yml) [![codecov](https://codecov.io/github/pkelaita/l2m2/graph/badge.svg?token=UWIB0L9PR8)](https://codecov.io/github/pkelaita/l2m2) [![PyPI version](https://badge.fury.io/py/l2m2.svg?timestamp=1722903983)](https://badge.fury.io/py/l2m2)
+[![Tests](https://github.com/pkelaita/l2m2/actions/workflows/tests.yml/badge.svg?timestamp=1722907074)](https://github.com/pkelaita/l2m2/actions/workflows/tests.yml) [![codecov](https://codecov.io/github/pkelaita/l2m2/graph/badge.svg?token=UWIB0L9PR8)](https://codecov.io/github/pkelaita/l2m2) [![PyPI version](https://badge.fury.io/py/l2m2.svg?timestamp=1722907074)](https://badge.fury.io/py/l2m2)

**L2M2** ("LLM Manager" → "LLMM" → "L2M2") is a tiny, very simple LLM manager for Python that exposes many models through a unified API. This is useful for evaluations, demos, and production applications that need to be easily model-agnostic.
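
As a rough illustration of that unified API, here is a minimal sketch using the model key corrected by this commit. It assumes the `LLMClient` interface (`add_provider` and `call`) documented elsewhere in the README and a `MISTRAL_API_KEY` environment variable; exact names may differ from the released package.

```python
# Minimal sketch, not part of this diff; assumes the LLMClient interface
# documented in the l2m2 README and a MISTRAL_API_KEY environment variable.
import os

from l2m2.client import LLMClient

client = LLMClient()
client.add_provider("mistral", os.environ["MISTRAL_API_KEY"])

# "mistral-large-2" is the corrected model key from this commit
# (previously mistyped as "mixtral-large-2").
response = client.call(
    model="mistral-large-2",
    prompt="What is the capital of France?",
)
print(response)
```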

@@ -37,7 +37,7 @@ L2M2 currently supports the following models:
| `claude-3-haiku` | [Anthropic](https://www.anthropic.com/api) | `claude-3-haiku-20240307` |
| `command-r` | [Cohere](https://docs.cohere.com/) | `command-r` |
| `command-r-plus` | [Cohere](https://docs.cohere.com/) | `command-r-plus` |
-| `mixtral-large-2` | [Mistral](https://mistral.ai/) | `mistral-large-latest` |
+| `mistral-large-2` | [Mistral](https://mistral.ai/) | `mistral-large-latest` |
| `mixtral-8x22b` | [Mistral](https://mistral.ai/), [OctoAI](https://octoai.cloud/) | `open-mixtral-8x22b`, `mixtral-8x22b-instruct` |
| `mixtral-8x7b` | [Mistral](https://mistral.ai/), [OctoAI](https://octoai.cloud/), [Groq](https://wow.groq.com/) | `open-mixtral-8x7b`, `mixtral-8x7b-instruct`, `mixtral-8x7b-32768` |
| `mistral-7b` | [Mistral](https://mistral.ai/), [OctoAI](https://octoai.cloud/) | `open-mistral-7b`, `mistral-7b-instruct` |
@@ -508,7 +508,7 @@ The following models natively support JSON mode via the given provider:
- `gpt-4-turbo` (via OpenAI)
- `gpt-3.5-turbo` (via OpenAI)
- `gemini-1.5-pro` (via Google)
-- `mixtral-large-2` (via Mistral)
+- `mistral-large-2` (via Mistral)
- `mixtral-8x22b` (via Mistral)
- `mixtral-8x7b` (via Mistral)
- `mistral-7b` (via Mistral)
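A hedged sketch of JSON mode with the corrected key follows; the `json_mode` flag name and the `LLMClient` setup are assumptions based on the rest of the README, and the raw response is treated here as a JSON string.

```python
# Sketch only; assumes LLMClient, add_provider, and a json_mode flag as
# described in the l2m2 README. Per the list above, mistral-large-2 natively
# supports JSON mode via Mistral.
import json
import os

from l2m2.client import LLMClient

client = LLMClient()
client.add_provider("mistral", os.environ["MISTRAL_API_KEY"])

response = client.call(
    model="mistral-large-2",
    prompt='List three EU capitals as a JSON object with a "capitals" array.',
    json_mode=True,
)
print(json.loads(response)["capitals"])
```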
2 changes: 1 addition & 1 deletion l2m2/__init__.py
@@ -1 +1 @@
-__version__ = "0.0.30"
+__version__ = "0.0.32"
2 changes: 1 addition & 1 deletion l2m2/model_info.py
@@ -319,7 +319,7 @@ class ModelEntry(TypedDict):
"extras": {},
},
},
"mixtral-large-2": {
"mistral-large-2": {
"mistral": {
"model_id": "mistral-large-latest",
"params": {
