
HF related warning is raised with AmazonBedrockGenerator #211

Closed
bilgeyucel opened this issue Jan 15, 2024 · 2 comments
@bilgeyucel
Contributor

bilgeyucel commented Jan 15, 2024

Describe the bug
I get a warning related to Hugging Face when using the AmazonBedrockGenerator on Colab. Everything works as expected otherwise.

Warning:

/usr/local/lib/python3.10/dist-packages/huggingface_hub/utils/_token.py:88: UserWarning: 
The secret `HF_TOKEN` does not exist in your Colab secrets.
To authenticate with the Hugging Face Hub, create a token in your settings tab (https://huggingface.co/settings/tokens), set it as secret in your Google Colab and restart your session.
You will be able to reuse this secret in all of your notebooks.
Please note that authentication is recommended but still optional to access public models or datasets.
  warnings.warn(

There's an issue here that might be related: huggingface/huggingface_hub#1929

To Reproduce
Run the code below after installing the integration with `pip install amazon-bedrock-haystack`:

from amazon_bedrock_haystack.generators.amazon_bedrock import AmazonBedrockGenerator

# Initialize the AmazonBedrockGenerator with an Amazon Bedrock model
# (aws_access_key_id and aws_secret_access_key are assumed to be defined beforehand)
bedrock_model = "amazon.titan-text-express-v1"
generator = AmazonBedrockGenerator(
    model_name=bedrock_model,
    aws_access_key_id=aws_access_key_id,
    aws_secret_access_key=aws_secret_access_key,
    aws_region_name="us-east-1",
    max_length=500,
)

Describe your environment (please complete the following information):

  • OS: Colab
  • Haystack version: Haystack 2.0-beta.4
  • Integration version: amazon-bedrock-haystack 0.1.0
@bilgeyucel bilgeyucel added bug Something isn't working integration:amazon-bedrock labels Jan 15, 2024
@masci masci added the P1 label Jan 15, 2024
@anakin87
Member

Under the hood, AmazonBedrockGenerator is using transformers for tokenization.
Lately, this warning is always raised when using transformers in Colab, unless you have set the HF_TOKEN secret.

An unrelated example that raises the same warning:

Example
# ! pip install accelerate

import torch
from transformers import pipeline

pipe = pipeline("text-generation", model="TinyLlama/TinyLlama-1.1B-Chat-v1.0", torch_dtype=torch.bfloat16, device_map="auto")

# We use the tokenizer's chat template to format each message - see https://huggingface.co/docs/transformers/main/en/chat_templating
messages = [
    {
        "role": "system",
        "content": "You are a friendly chatbot who always responds in the style of a pirate",
    },
    {"role": "user", "content": "How many helicopters can a human eat in one sitting?"},
]
prompt = pipe.tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
outputs = pipe(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])

So I would say there is nothing we can do to prevent this warning...
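For users who find the notice noisy, a possible workaround on the user side (not part of the integration itself, and only a sketch) is to either set the `HF_TOKEN` environment variable, or filter this specific `UserWarning` with the standard library:

```python
import warnings

# Silence only the huggingface_hub HF_TOKEN notice (a UserWarning).
# The message pattern needs (?s) because the real warning text starts
# with a newline, which "." does not match by default.
warnings.filterwarnings(
    "ignore",
    message=r"(?s).*`HF_TOKEN` does not exist",
    category=UserWarning,
)

# Demonstration: an equivalent warning is now silently dropped.
with warnings.catch_warnings(record=True) as caught:
    warnings.filterwarnings(
        "ignore",
        message=r"(?s).*`HF_TOKEN` does not exist",
        category=UserWarning,
    )
    warnings.warn(
        "\nThe secret `HF_TOKEN` does not exist in your Colab secrets.",
        UserWarning,
    )

print(f"warnings recorded: {len(caught)}")  # → warnings recorded: 0
```

This hides the notice globally in the process, so it should be used deliberately; setting the `HF_TOKEN` secret in Colab remains the cleaner fix.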

@bilgeyucel
Contributor Author

Thanks for the explanation @anakin87. Closing this now!

@bilgeyucel bilgeyucel closed this as not planned Jan 18, 2024