
[tune](deps): Bump transformers from 4.8.1 to 4.21.2 in /python/requirements/tune#94

Closed
dependabot[bot] wants to merge 1 commit into master from dependabot/pip/python/requirements/tune/transformers-4.21.2

Conversation


@dependabot dependabot bot commented on behalf of github Aug 27, 2022

Bumps transformers from 4.8.1 to 4.21.2.

Release notes

Sourced from transformers' releases.

v4.21.2: Patch release

Fix a regression in the TableQA pipeline: #18428

v4.21.1: Patch release

Fix a regression in Trainer checkpoint loading: #18470

v4.21.0: TF XLA text generation - Custom Pipelines - OwlViT, NLLB, MobileViT, Nezha, GroupViT, MVP, CodeGen, UL2

TensorFlow XLA Text Generation

The TensorFlow text generation method can now be wrapped with tf.function and compiled to XLA. You should be able to achieve up to 100x speedup this way. See our blog post and our benchmarks. You can also see XLA generation in action in our example notebooks, particularly for summarization and translation.

```python
import tensorflow as tf
from transformers import AutoTokenizer, TFAutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = TFAutoModelForSeq2SeqLM.from_pretrained("t5-small")

# Main changes with respect to the original generate workflow:
# tf.function and pad_to_multiple_of
xla_generate = tf.function(model.generate, jit_compile=True)
tokenization_kwargs = {"pad_to_multiple_of": 32, "padding": True, "return_tensors": "tf"}

# The first prompt will be slow (compiling), the others will be very fast!
input_prompts = [
    f"translate English to {language}: I have four cats and three dogs."
    for language in ["German", "French", "Romanian"]
]
for input_prompt in input_prompts:
    tokenized_inputs = tokenizer([input_prompt], **tokenization_kwargs)
    generated_text = xla_generate(**tokenized_inputs, max_new_tokens=32)
    print(tokenizer.decode(generated_text[0], skip_special_tokens=True))
```

New model additions

OwlViT

The OWL-ViT model (short for Vision Transformer for Open-World Localization) was proposed in Simple Open-Vocabulary Object Detection with Vision Transformers by Matthias Minderer, Alexey Gritsenko, Austin Stone, Maxim Neumann, Dirk Weissenborn, Alexey Dosovitskiy, Aravindh Mahendran, Anurag Arnab, Mostafa Dehghani, Zhuoran Shen, Xiao Wang, Xiaohua Zhai, Thomas Kipf, and Neil Houlsby. OWL-ViT is an open-vocabulary object detection network trained on a variety of (image, text) pairs. It can be used to query an image with one or multiple text queries to search for and detect target objects described in text.
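The release notes don't include a code sample for OWL-ViT here. As a toy illustration of the open-vocabulary idea only (not the actual transformers API or OWL-ViT's real scoring), each candidate image-region embedding can be scored against every text-query embedding, keeping the best-matching query per region above a threshold:

```python
# Toy sketch of open-vocabulary matching: region embeddings vs. text
# query embeddings via cosine similarity. Purely illustrative; the real
# OWL-ViT model computes these embeddings with a vision transformer and
# a text encoder, and predicts boxes alongside the scores.
import math


def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)


def match_regions(region_embeds, query_embeds, queries, threshold=0.5):
    """Return (region_index, best_query, score) for each region whose
    best text query scores at or above the threshold."""
    detections = []
    for i, region in enumerate(region_embeds):
        scores = [cosine(region, q) for q in query_embeds]
        best = max(range(len(scores)), key=scores.__getitem__)
        if scores[best] >= threshold:
            detections.append((i, queries[best], scores[best]))
    return detections
```

This captures only the query-matching step described above; in the library, the model consumes raw images and text strings through a processor class.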

NLLB

... (truncated)

Commits
  • b487096 Patch release: v4.21.2
  • c5f7df8 Accept trust_remote_code and ignore it in PreTrainedModel.from_pretrained...
  • f0d4968 Patch release: v4.21.1
  • dea58d6 Fix load of model checkpoints in the Trainer (#18470)
  • a9eee2f Release: v4.21.0
  • 0daa202 Fix sacremoses soft dependency for Transformers XL
  • 31b3a12 sentencepiece shouldn't be required for the fast LayoutXLM tokenizer
  • 3496ea8 Remove all uses of six (#18318)
  • 9e564d0 fix loading from pretrained for sharded model with `torch_dtype="auto"` (#18061)
  • 36f9859 [EncoderDecoder] Improve docs (#18271)
  • Additional commits viewable in compare view

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
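The ignore commands above distinguish major, minor, and patch updates. As a minimal sketch (not Dependabot's actual implementation, which also handles pre-releases and non-semver schemes), a bump can be classified by comparing version components left to right:

```python
# Minimal semver bump classifier; illustrative only.
def classify_bump(old: str, new: str) -> str:
    """Return 'major', 'minor', or 'patch' for a version bump."""
    o = [int(x) for x in old.split(".")]
    n = [int(x) for x in new.split(".")]
    if n[0] != o[0]:
        return "major"
    if n[1] != o[1]:
        return "minor"
    return "patch"


# The bump in this PR (4.8.1 -> 4.21.2) is a minor update, matching the
# update-type recorded in the commit metadata below.
print(classify_bump("4.8.1", "4.21.2"))
```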

Bumps [transformers](https://github.com/huggingface/transformers) from 4.8.1 to 4.21.2.
- [Release notes](https://github.com/huggingface/transformers/releases)
- [Commits](huggingface/transformers@v4.8.1...v4.21.2)

---
updated-dependencies:
- dependency-name: transformers
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
@dependabot dependabot bot added the dependencies Pull requests that update a dependency file label Aug 27, 2022

dependabot bot commented on behalf of github Sep 10, 2022

Superseded by #98.

@dependabot dependabot bot closed this Sep 10, 2022
@dependabot dependabot bot deleted the dependabot/pip/python/requirements/tune/transformers-4.21.2 branch September 10, 2022 07:05
