CLIPTokenizer does not work as expected #2018

Open

fdtomasi opened this issue Dec 11, 2024 · 0 comments

To Reproduce

from keras_hub import models

tokenizer = models.Tokenizer.from_preset(
    "clip_vit_h_14_laion2b_s32b_b79k",
    sequence_length=77,
    pad_with_end_token=True,
)
preprocessor = models.CLIPPreprocessor(tokenizer, sequence_length=77)
preprocessor(["a cat sitting on the table"])

which returns

{'token_ids': <tf.Tensor: shape=(1, 77), dtype=int32, numpy=
 array([[49406,   320,  2368,  4919,   525,   518,  2175,     0,     0,
             0,     0,     0,     0,     0,     0,     0,     0,     0,
             0,     0,     0,     0,     0,     0,     0,     0,     0,
             0,     0,     0,     0,     0,     0,     0,     0,     0,
             0,     0,     0,     0,     0,     0,     0,     0,     0,
             0,     0,     0,     0,     0,     0,     0,     0,     0,
             0,     0,     0,     0,     0,     0,     0,     0,     0,
             0,     0,     0,     0,     0,     0,     0,     0,     0,
             0,     0,     0,     0, 49407]], dtype=int32)>,
 'padding_mask': <tf.Tensor: shape=(1, 77), dtype=bool, numpy=
 array([[ True,  True,  True,  True,  True,  True,  True,  True,  True,
          True,  True,  True,  True,  True,  True,  True,  True,  True,
          True,  True,  True,  True,  True,  True,  True,  True,  True,
          True,  True,  True,  True,  True,  True,  True,  True,  True,
          True,  True,  True,  True,  True,  True,  True,  True,  True,
          True,  True,  True,  True,  True,  True,  True,  True,  True,
          True,  True,  True,  True,  True,  True,  True,  True,  True,
          True,  True,  True,  True,  True,  True,  True,  True,  True,
          True,  True,  True,  True,  True]])>}

This is surprising for a few reasons. First, even though pad_with_end_token=True, padding uses token 0 (which corresponds to ! in this vocabulary). Second, the end token (49407) is appended after the padding rather than immediately after the original sequence. Finally, padding_mask is all True, whereas I would expect it to be False at the padding positions.
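
For clarity, here is a minimal sketch of the output I would expect, built by padding manually after tokenization. It assumes the TensorFlow backend (so the tokenizer returns a tf.RaggedTensor), assumes the plain tokenizer emits no special tokens, and hard-codes the start/end ids 49406/49407 visible in the output above:

import tensorflow as tf
from keras_hub import models

SEQ_LEN = 77
START_ID, END_ID = 49406, 49407  # start/end ids taken from the output above

tokenizer = models.Tokenizer.from_preset("clip_vit_h_14_laion2b_s32b_b79k")
raw = tokenizer(["a cat sitting on the table"])  # assumed ragged, no specials

token_rows, mask_rows = [], []
for ids in raw.to_list():
    ids = [START_ID] + ids + [END_ID]  # end token right after the text
    pad = SEQ_LEN - len(ids)
    token_rows.append(ids + [END_ID] * pad)  # pad with the end token itself
    mask_rows.append([True] * len(ids) + [False] * pad)  # False on padding

token_ids = tf.constant(token_rows, dtype="int32")
padding_mask = tf.constant(mask_rows)

For the example sentence, this would give token_ids starting [49406, 320, 2368, 4919, 525, 518, 2175, 49407, 49407, ...] and a padding_mask that is False from position 8 onward.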

Additional context
Using keras_hub==0.18.1, keras==3.7.0.

mehtamansi29 self-assigned this Dec 12, 2024