
Conversation

dependabot bot commented on behalf of github on May 27, 2024

Bumps llama-cpp-python from 0.2.57 to 0.2.76.

Changelog

Sourced from llama-cpp-python's changelog.

[0.2.76]

[0.2.75]

[0.2.74]

[0.2.73]

  • feat: Update llama.cpp to ggml-org/llama.cpp@25c6e82
  • fix: Clear kv cache at beginning of image chat formats to avoid bug when image is evaluated first by @abetlen in ac55d0a175115d1e719672ce1cb1bec776c738b1

[0.2.72]

  • fix(security): Remote Code Execution by Server-Side Template Injection in Model Metadata by @retr0reg in b454f40a9a1787b2b5659cd2cb00819d983185df
  • fix(security): Update remaining jinja chat templates to use immutable sandbox by @CISC in #1441

[0.2.71]

  • feat: Update llama.cpp to ggml-org/llama.cpp@911b390
  • fix: Make leading bos_token optional for image chat formats, fix nanollava system message by @abetlen in 77122638b4153e31d9f277b3d905c2900b536632
  • fix: free last image embed in llava chat handler by @abetlen in 3757328b703b2cd32dcbd5853271e3a8c8599fe7

[0.2.70]

  • feat: Update llama.cpp to ggml-org/llama.cpp@c0e6fbf
  • feat: fill-in-middle support by @CISC in #1386
  • fix: adding missing args in create_completion for functionary chat handler by @skalade in #1430
  • docs: update README.md by @eltociear in #1432
  • fix: chat_format log where auto-detected format prints None by @balvisio in #1434
  • feat(server): Add support for setting root_path by @abetlen in 0318702cdc860999ee70f277425edbbfe0e60419
  • feat(ci): Add docker checks and check deps more frequently by @Smartappli in #1426
  • fix: detokenization case where first token does not start with a leading space by @noamgat in #1375
  • feat: Implement streaming for Functionary v2 + Bug fixes by @jeffrey-fong in #1419
  • fix: Use memmove to copy str_value kv_override by @abetlen in 9f7a85571ae80d3b6ddbd3e1bae407b9f1e3448a
  • feat(server): Remove temperature bounds checks for server by @abetlen in 0a454bebe67d12a446981eb16028c168ca5faa81
  • fix(server): Propagate flash_attn to model load by @dthuerck in #1424

... (truncated)
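
The [0.2.72] entries are the notable ones in this range: they close a remote-code-execution hole by rendering model-supplied chat templates inside Jinja2's immutable sandbox. The snippet below is only a minimal sketch of that sandboxing pattern, not llama-cpp-python's own code; the template string and messages are invented for illustration.

```python
# Minimal sketch of the "immutable sandbox" hardening referenced in the
# 0.2.72 security fixes above. The template string and message list are
# invented for illustration; this is not llama-cpp-python's own code.
from jinja2.exceptions import SecurityError
from jinja2.sandbox import ImmutableSandboxedEnvironment

untrusted_template = (
    "{% for m in messages %}<|{{ m.role }}|>{{ m.content }}\n{% endfor %}"
)

env = ImmutableSandboxedEnvironment(trim_blocks=True, lstrip_blocks=True)
chat_template = env.from_string(untrusted_template)

messages = [{"role": "user", "content": "Hello"}]
print(chat_template.render(messages=messages))  # <|user|>Hello

# A template that tries to mutate state (or reach Python internals) is
# rejected by the sandbox instead of being executed.
try:
    env.from_string("{{ messages.append('pwned') }}").render(messages=messages)
except SecurityError as exc:
    print("blocked by sandbox:", exc)
```

Rendering untrusted metadata through ImmutableSandboxedEnvironment rather than a plain jinja2.Environment is what turns a malicious chat template from code execution into a caught SecurityError.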

Commits

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

Bumps [llama-cpp-python](https://github.com/abetlen/llama-cpp-python) from 0.2.57 to 0.2.76.
- [Release notes](https://github.com/abetlen/llama-cpp-python/releases)
- [Changelog](https://github.com/abetlen/llama-cpp-python/blob/main/CHANGELOG.md)
- [Commits](https://github.com/abetlen/llama-cpp-python/compare/v0.2.57...v0.2.76)

---
updated-dependencies:
- dependency-name: llama-cpp-python
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
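
For a project taking this update, the change itself is just moving the pip pin from llama-cpp-python==0.2.57 to llama-cpp-python==0.2.76 (the metadata above marks it as a direct production dependency). As a quick post-install sanity check, a sketch along these lines works, assuming the installed package exposes llama_cpp.__version__ as recent releases do:

```python
# Post-upgrade sanity check: confirm the environment picked up the bumped
# release. Assumes llama_cpp exposes __version__ (true for recent releases);
# the expected string mirrors this PR's target version.
import llama_cpp

expected = "0.2.76"
installed = getattr(llama_cpp, "__version__", "unknown")
if installed != expected:
    raise SystemExit(f"expected llama-cpp-python {expected}, found {installed}")
print(f"llama-cpp-python {installed} is installed")
```

Since 0.x releases can still shift behavior despite the semver-patch tag, running the project's own test suite after reinstalling is the safer confirmation.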
dependabot bot added the dependencies (Pull requests that update a dependency file) and python (Pull requests that update Python code) labels on May 27, 2024

dependabot bot commented on behalf of github on Jun 6, 2024

Superseded by #107.

dependabot bot closed this on Jun 6, 2024
dependabot bot deleted the dependabot/pip/llama-cpp-python-0.2.76 branch on June 6, 2024 at 17:03

Labels

  • dependencies: Pull requests that update a dependency file
  • python: Pull requests that update Python code

Projects

None yet

Development

Successfully merging this pull request may close these issues.

1 participant