Conversation

dependabot bot (Contributor) commented on behalf of github on May 8, 2024

Bumps llama-cpp-python from 0.2.57 to 0.2.70.

Changelog

Sourced from llama-cpp-python's changelog.

[0.2.70]

  • feat: Update llama.cpp to ggerganov/llama.cpp@
  • feat: fill-in-middle support by @CISC in #1386 (see the sketch after this list)
  • fix: adding missing args in create_completion for functionary chat handler by @​skalade in #1430
  • docs: update README.md by @eltociear in #1432
  • fix: chat_format log where auto-detected format prints None by @​balvisio in #1434
  • feat(server): Add support for setting root_path by @​abetlen in 0318702cdc860999ee70f277425edbbfe0e60419
  • feat(ci): Add docker checks and check deps more frequently by @​Smartappli in #1426
  • fix: detokenization case where first token does not start with a leading space by @​noamgat in #1375
  • feat: Implement streaming for Functionary v2 + Bug fixes by @​jeffrey-fong in #1419
  • fix: Use memmove to copy str_value kv_override by @​abetlen in 9f7a85571ae80d3b6ddbd3e1bae407b9f1e3448a
  • feat(server): Remove temperature bounds checks for server by @​abetlen in 0a454bebe67d12a446981eb16028c168ca5faa81
  • fix(server): Propagate flash_attn to model load by @​dthuerck in #1424
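
Of the 0.2.70 entries above, fill-in-middle support (#1386) and the flash_attn propagation fix (#1424) are the most directly user-visible. Below is a minimal Python sketch of how they might be exercised; the model path is a placeholder, and `suffix`-based infilling assumes a model that ships FIM tokens, so treat this as an illustration rather than documented usage.

```python
from llama_cpp import Llama

# Placeholder GGUF path; any FIM-capable code model would do.
llm = Llama(
    model_path="./codellama-7b.Q4_K_M.gguf",
    flash_attn=True,  # flash attention flag; 0.2.70 also fixes propagating it on server model load
    verbose=False,
)

# Fill-in-middle (0.2.70, #1386): text before the cursor goes in `prompt`,
# text after it goes in `suffix`, and the model fills the gap between them.
out = llm.create_completion(
    prompt="def add(a, b):\n    ",
    suffix="\n    return result\n",
    max_tokens=32,
    temperature=0.2,
)
print(out["choices"][0]["text"])
```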

[0.2.69]

  • feat: Update llama.cpp to ggml-org/llama.cpp@6ecf318
  • feat: Add llama-3-vision-alpha chat format by @​abetlen in 31b1d95a6c19f5b615a3286069f181a415f872e8
  • fix: Change the default value of verbose in image chat format handlers to True to match Llama by @abetlen in 4f01c452b6c738dc56eacac3758119b12c57ea94
  • fix: Suppress all logs when verbose=False, using hardcoded file descriptors so suppression also works in Colab notebooks by @abetlen in f116175a5a7c84569c88cad231855c1e6e59ff6e
  • fix: UTF-8 handling with grammars by @jsoma in #1415 (see the sketch after this list)
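
Two of the 0.2.69 fixes are easy to check from user code: verbose=False should now silence all native llama.cpp logging, and grammar-constrained output should decode multi-byte UTF-8 correctly (#1415). A minimal sketch under those assumptions, with a placeholder model path:

```python
from llama_cpp import Llama, LlamaGrammar

# Placeholder model path; per 0.2.69, verbose=False should suppress all native llama.cpp logs.
llm = Llama(model_path="./mistral-7b-instruct.Q4_K_M.gguf", verbose=False)

# A deliberately tiny GBNF grammar whose only valid output is multi-byte UTF-8 text,
# the case the #1415 fix addresses in grammar-constrained decoding.
grammar = LlamaGrammar.from_string('root ::= "こんにちは"', verbose=False)

out = llm.create_completion(
    prompt="Say hello in Japanese:",
    grammar=grammar,
    max_tokens=8,
)
print(out["choices"][0]["text"])  # expected: こんにちは, decoded without mojibake
```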

[0.2.68]

[0.2.67]

  • fix: Ensure image renders before text in chat formats regardless of message content order by @​abetlen in 3489ef09d3775f4a87fb7114f619e8ba9cb6b656
  • fix(ci): Fix bug in use of upload-artifact failing to merge multiple artifacts into a single release by @​abetlen in d03f15bb73a1d520970357b702a9e7d4cc2a7a62

[0.2.66]

[0.2.65]

... (truncated)

Commits
  • 9ce5cb3 chore: Bump version
  • 4a7122d feat: fill-in-middle support (#1386)
  • 228949c feat: Update llama.cpp
  • 903b28a fix: adding missing args in create_completion for functionary chat handler (#...
  • 07966b9 docs: update README.md (#1432)
  • a50d24e fix: chat_format log where auto-detected format prints None (#1434)
  • 0318702 feat(server): Add support for setting root_path. Closes #1420
  • 3666833 feat(ci): Add docker checks and check deps more frequently (#1426)
  • 3e2597e feat: Update llama.cpp
  • e0d7674 fix: detokenization case where first token does not start with a leading spac...
  • Additional commits viewable in compare view

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

Commit message:

Bumps [llama-cpp-python](https://github.com/abetlen/llama-cpp-python) from 0.2.57 to 0.2.70.
- [Release notes](https://github.com/abetlen/llama-cpp-python/releases)
- [Changelog](https://github.com/abetlen/llama-cpp-python/blob/main/CHANGELOG.md)
- [Commits](https://github.com/abetlen/llama-cpp-python/compare/v0.2.57...v0.2.70)

---
updated-dependencies:
- dependency-name: llama-cpp-python
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
dependabot bot added the dependencies and python labels on May 8, 2024

dependabot bot commented on behalf of github on May 10, 2024

Superseded by #101.

dependabot bot closed this on May 10, 2024
dependabot bot deleted the dependabot/pip/llama-cpp-python-0.2.70 branch on May 10, 2024, 16:29