
build(deps): Bump llama-cpp-python[server] from 0.2.79 to 0.2.90 #52

Closed

Conversation

dependabot[bot]
Contributor

@dependabot dependabot bot commented on behalf of github Sep 19, 2024

Bumps llama-cpp-python[server] from 0.2.79 to 0.2.90.
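In practice, a Dependabot PR like this one changes only the pinned version of the dependency. A minimal sketch of the diff it would produce, assuming the project pins the package in a `requirements.txt` (the file name and layout are assumptions, not confirmed by this PR):

```diff
--- a/requirements.txt
+++ b/requirements.txt
-llama-cpp-python[server]==0.2.79
+llama-cpp-python[server]==0.2.90
```

The `[server]` extra pulls in the additional packages needed to run the bundled OpenAI-compatible HTTP server, so the bump applies to those transitive requirements as well.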

Changelog

Sourced from llama-cpp-python[server]'s changelog.

[0.2.90]

[0.2.89]

[0.2.88]

[0.2.87]

  • feat: Update llama.cpp to ggerganov/llama.cpp@be55695
  • fix: Include all llama.cpp source files and subdirectories by @abetlen in 9cad5714ae6e7c250af8d0bbb179f631368c928b
  • feat(ci): Re-build wheel index automatically when releases are created by @abetlen in 198f47dc1bd202fd2b71b29e041a9f33fe40bfad

[0.2.86]

[0.2.85]

[0.2.84]

[0.2.83]

... (truncated)

Commits
  • 077ecb6 chore: Bump version
  • 332720d feat: Update llama.cpp
  • ad2deaf docs: Add MiniCPM-V-2.6 to multi-modal model list
  • cbbfad4 docs: center icon and resize
  • b570fd3 docs: Add project icon courtesy of 🤗
  • 97d527e feat: Add server chat_format minicpm-v-2.6 for MiniCPMv26ChatHandler
  • c68e7fb fix: pull all gh releases for self-hosted python index
  • e251a0b fix: Update name to MiniCPMv26ChatHandler
  • f70df82 feat: Add MiniCPMv26 chat handler.
  • 82ae7f9 feat: Update llama.cpp
  • Additional commits viewable in compare view

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

Bumps [llama-cpp-python[server]](https://github.com/abetlen/llama-cpp-python) from 0.2.79 to 0.2.90.
- [Release notes](https://github.com/abetlen/llama-cpp-python/releases)
- [Changelog](https://github.com/abetlen/llama-cpp-python/blob/main/CHANGELOG.md)
- [Commits](abetlen/llama-cpp-python@v0.2.79...v0.2.90)

---
updated-dependencies:
- dependency-name: llama-cpp-python[server]
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <[email protected]>
@dependabot dependabot bot added dependencies Pull requests that update a dependency file python Pull requests that update Python code labels Sep 19, 2024
@dependabot dependabot bot requested a review from a team as a code owner September 19, 2024 00:46
@jeffmaury
Contributor

Replaced by #53

@jeffmaury jeffmaury closed this Sep 24, 2024
Contributor Author

dependabot bot commented on behalf of github Sep 24, 2024

OK, I won't notify you again about this release, but will get in touch when a new version is available. If you'd rather skip all updates until the next major or minor version, let me know by commenting @dependabot ignore this major version or @dependabot ignore this minor version. You can also ignore all major, minor, or patch releases for a dependency by adding an ignore condition with the desired update_types to your config file.

If you change your mind, just re-open this PR and I'll resolve any conflicts on it.
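The "ignore condition with the desired update_types" mentioned above lives in the repository's Dependabot configuration. A minimal sketch of what that could look like in `.github/dependabot.yml`, suppressing future patch-level PRs for this dependency (the `directory` and `schedule` values are assumptions about this repository's setup):

```yaml
version: 2
updates:
  - package-ecosystem: "pip"
    directory: "/"
    schedule:
      interval: "weekly"
    ignore:
      # Skip patch releases of llama-cpp-python[server];
      # major and minor bumps will still open PRs.
      - dependency-name: "llama-cpp-python[server]"
        update-types: ["version-update:semver-patch"]
```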

@dependabot dependabot bot deleted the dependabot/pip/llama-cpp-python-server--0.2.90 branch September 24, 2024 06:43