
cache can't be cleaned or disabled #7260

Open
charliedream1 opened this issue Oct 29, 2024 · 0 comments

Comments


charliedream1 commented Oct 29, 2024

Describe the bug

I tried the following approaches, but the cache can't be disabled.

I have 2 TB of data, and processing it generates more than 2 TB of cache files, which puts pressure on my storage. I need to either disable the cache entirely or clean it up immediately after processing. None of the approaches below work; please give some help!

from datasets import load_dataset, disable_caching
from transformers import AutoTokenizer

# globally disable caching for map() results
disable_caching()

tokenizer = AutoTokenizer.from_pretrained(args.tokenizer_path)

def tokenization_fn(examples):
    column_name = 'text' if 'text' in examples else 'data'
    tokenized_inputs = tokenizer(
        examples[column_name], return_special_tokens_mask=True, truncation=False,
        max_length=tokenizer.model_max_length
    )
    return tokenized_inputs

data = load_dataset('json', data_files=save_local_path, split='train', cache_dir=None)
data.cleanup_cache_files()  # remove any cache files already written for this dataset
updated_dataset = data.map(tokenization_fn, load_from_cache_file=False)
updated_dataset.cleanup_cache_files()
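For reference, a workaround I'm also considering is streaming mode. My understanding is that streaming=True returns an IterableDataset whose map() is applied lazily during iteration, so nothing is materialized as Arrow cache files on disk. A minimal sketch under that assumption, reusing tokenization_fn from above:

# Sketch (my assumption): a streamed dataset is never written to disk,
# at the cost of losing random access and a known length.
streamed = load_dataset('json', data_files=save_local_path, split='train', streaming=True)
tokenized_stream = streamed.map(tokenization_fn)  # lazy; runs while iterating

for example in tokenized_stream:
    pass  # consume examples here without generating an on-disk cache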

Expected behavior

No cache files should be generated.

Environment info

Ubuntu 20.04.6 LTS
datasets 3.0.2
