Releases: openvinotoolkit/openvino_testdrive

v24.4.0

06 Nov 08:41
Pre-release

🚀 [v24.4.0]

Summary of major features

  • Download LLM Models: Access models from the Hugging Face OpenVINO LLM collection.
  • Text Generation: Use LLMs to generate text.
  • Chat Templates: Chat with LLMs using their pre-defined chat templates.
  • Performance Metrics: Monitor LLM performance, including load time, time to first token, and time per output token.
  • Configuration Options: Adjust temperature and top-p settings to control the variability and randomness of generated responses (see the first sketch after this list).
  • Inference for Computer Vision Models: Run inference on all computer vision models trained using Intel Geti (see the second sketch after this list).
  • Task-Chain Inference: Execute inference for all task-chains created by Intel Geti.
  • Model Testing: Test computer vision models using a sample image or an image from the local disk.
  • Batch Inference: Perform batch inference for computer vision by selecting source and destination folders.
  • Processor Selection: Choose processor types (CPU, GPU, NPU) for model execution.
  • Windows Installer: Install via MSIX for a seamless setup.
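
The text-generation features above require no code inside Test Drive itself. For reference, the sketch below shows roughly how the same workflow looks when driven programmatically through the OpenVINO GenAI Python API; the model folder, prompt, and device are placeholder assumptions, and this is an illustrative sketch, not the app's internal implementation.

    # Minimal sketch, assuming the openvino_genai package and an OpenVINO IR LLM
    # downloaded from the Hugging Face OpenVINO collection into the given folder.
    import openvino_genai as ov_genai

    # Device selection mirrors the CPU/GPU/NPU choice in the UI.
    pipe = ov_genai.LLMPipeline("TinyLlama-1.1B-Chat-v1.0-int4-ov", "CPU")

    # Temperature and top-p control the randomness/variability of responses.
    config = ov_genai.GenerationConfig()
    config.do_sample = True
    config.temperature = 0.7
    config.top_p = 0.9
    config.max_new_tokens = 128

    res = pipe.generate(["What is OpenVINO?"], config)
    print(res.texts[0])

    # Performance metrics reported by the pipeline: load time,
    # time to first token (TTFT), and time per output token (TPOT).
    metrics = res.perf_metrics
    print(f"Load time: {metrics.get_load_time():.2f} ms")
    print(f"TTFT:      {metrics.get_ttft().mean:.2f} ms")
    print(f"TPOT:      {metrics.get_tpot().mean:.2f} ms/token")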

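Similarly, the computer-vision features build on the standard OpenVINO Runtime. The second sketch below is a generic illustration of loading a Geti-exported IR model and running it on a selected device; the file names are placeholders, the input layout is assumed to be static NCHW, and Geti-specific pre- and post-processing is omitted.

    # Minimal sketch, assuming a Geti-exported OpenVINO IR (model.xml + model.bin)
    # and a sample image on disk.
    import cv2
    import numpy as np
    import openvino as ov

    core = ov.Core()
    # "CPU", "GPU", or "NPU" corresponds to the processor selection in the UI.
    compiled = core.compile_model(core.read_model("model.xml"), "CPU")

    # Resize one image to the model's expected input shape (assumed static NCHW).
    _, _, h, w = compiled.input(0).shape
    image = cv2.imread("sample.jpg")
    blob = cv2.resize(image, (w, h)).transpose(2, 0, 1)[np.newaxis].astype(np.float32)

    # Batch inference in the app amounts to repeating this call over a source
    # folder and writing the results to a destination folder.
    result = compiled(blob)[compiled.output(0)]
    print(result.shape)
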
Known issues

  • The AUTO device plugin may cause issues when running large language models (LLMs).

Download this release

Download the Windows installer for the v24.4.0 beta release of OpenVINO Test Drive