Commit

update doc
riteshghorse committed Sep 19, 2023
1 parent a1b7fb6 commit 87bd300
Showing 1 changed file with 11 additions and 6 deletions.
17 changes: 11 additions & 6 deletions sdks/python/apache_beam/ml/inference/huggingface_inference.py
@@ -571,7 +571,7 @@ class HuggingFacePipelineModelHandler(ModelHandler[str,
   def __init__(
       self,
       task: Union[str, PipelineTask] = "",
-      model=None,
+      model: str = "",
       *,
       inference_fn: PipelineInferenceFn = _default_pipeline_inference_fn,
       load_pipeline_args: Optional[Dict[str, Any]] = None,
@@ -594,11 +594,16 @@ def __init__(
     Args:
       task (str or enum.Enum): task supported by HuggingFace Pipelines.
         Accepts a string task or an enum.Enum from PipelineTask.
-      model : path to the pretrained model-id on Hugging Face Models Hub
-        to use custom model for the chosen task. If the model already defines
-        the task then no need to specify the task parameter. Use the model_id
-        string instead of an actual model here. Model specific kwargs can be
-        specified with model_kwargs using load_pipeline_args.
+      model (str): model-id of a pretrained model on the Hugging Face Models
+        Hub, used to select a custom model for the chosen task. If the `model`
+        already defines the task, the `task` parameter need not be specified.
+        Pass the model-id string rather than an actual model instance here.
+        Model-specific kwargs for `from_pretrained(..., **model_kwargs)` can
+        be specified with `model_kwargs` using `load_pipeline_args`.
+        Example Usage:
+          model_handler = HuggingFacePipelineModelHandler(
+            task="text-generation", model="meta-llama/Llama-2-7b-hf",
+            load_pipeline_args={'model_kwargs':{'quantization_map':config}})
       inference_fn: the inference function to use during RunInference.
         Default is _default_pipeline_inference_fn.
       load_pipeline_args (Dict[str, Any]): keyword arguments to provide load
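The rule the updated docstring describes — `task` may be omitted when the chosen `model` already determines one — can be sketched as a small resolution helper. This is a hypothetical illustration only: `resolve_task` and the `_MODEL_DEFAULT_TASK` table below are not part of Beam or transformers; the real handler simply forwards both arguments to `transformers.pipeline()`, which infers the task from the model's Hub metadata.

```python
from typing import Optional

# Hypothetical lookup standing in for Hub metadata: models whose
# configuration already declares a default pipeline task.
_MODEL_DEFAULT_TASK = {
    "meta-llama/Llama-2-7b-hf": "text-generation",
    "bert-base-uncased": "fill-mask",
}


def resolve_task(task: str = "", model: str = "") -> str:
    """Return the task to load, preferring an explicit `task` argument.

    Mirrors the docstring's contract: an explicit task wins; otherwise
    the model-id must itself define one, or resolution fails.
    """
    if task:
        return task
    inferred: Optional[str] = _MODEL_DEFAULT_TASK.get(model)
    if inferred is None:
        raise ValueError(
            "No task given and model %r does not define one" % model)
    return inferred
```

Usage follows the docstring: `resolve_task(model="bert-base-uncased")` succeeds because the model-id pins down a task, while an unknown model-id with no explicit `task` raises `ValueError`.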
