My fork of an open-source, lightweight, high-performance inference framework for Hailo devices.

bernd-heihoff/hailort

 
 

HailoRT

HailoRT is a lightweight, production-grade runtime library that runs on the host processor. It provides a robust user-space library (the HailoRT Library) with intuitive C/C++ APIs for optimized performance.

HailoRT consists of the following main components:

  • HailoRT Library.
  • HailoRT CLI - a command line application used to control the Hailo device, run inferences, collect statistics and device events, etc.
  • HailoRT PCIe Driver - the device driver used to manage the Hailo device, communicate with the device, and transfer data to/from the device; it includes the Hailo-8 firmware that runs on the Hailo device and manages its boot and control.
  • pyHailoRT - HailoRT Python API, which wraps the runtime library.
  • HailoRT GStreamer element (HailoNet).
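
To give a feel for the CLI component described above, here is a short sketch of typical hailortcli invocations. The subcommand names below are assumptions based on the component description, not an authoritative reference; check `hailortcli --help` on your installation for the exact command set.

```shell
# List Hailo devices visible to the host (assumed subcommand)
hailortcli scan

# Query device and firmware information (assumed subcommand)
hailortcli fw-control identify

# Run inference with a compiled HEF model file (assumed subcommand;
# the path is a placeholder for your own model)
hailortcli run /path/to/model.hef
```

These commands require a Hailo device attached to the host and the PCIe driver installed.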

HailoRT supports Linux and Windows, and it can be compiled from sources to be integrated with various x86 and ARM processors.
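
As a rough sketch of what compiling from sources looks like, a generic CMake flow is shown below. The repository URL and flags are assumptions for illustration; the authoritative steps and supported options are in the Compiling HailoRT from Sources guide.

```shell
# Fetch the sources (upstream repository shown; a fork is cloned analogously)
git clone https://github.com/hailo-ai/hailort.git
cd hailort

# Configure and build with CMake (on Windows, multi-config generators
# take the configuration at build time, e.g. --config Release)
cmake -S . -B build -DCMAKE_BUILD_TYPE=Release
cmake --build build
```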

Usage

See the hailo.ai developer zone documentation (registration is required for full documentation access).

For compilation instructions, see Compiling HailoRT from Sources.

For HailoRT API examples, see the HailoRT examples.

Changelog

See the HailoRT changelog in the hailo.ai developer zone (registration required).

Licenses

HailoRT is distributed under two licenses:

  • libhailort, pyhailort & hailortcli - distributed under the MIT license
  • hailonet (GStreamer plugin) - distributed under the LGPL 2.1 license

Contact

Contact information and support are available at hailo.ai.

About Hailo

Hailo offers breakthrough AI Inference Accelerators and AI Vision Processors uniquely designed to accelerate embedded deep learning applications on edge devices.

The Hailo AI Inference Accelerators allow edge devices to run deep learning applications at full scale more efficiently, effectively, and sustainably, with an architecture that takes advantage of the core properties of neural networks.

The Hailo AI Vision Processors (SoCs) combine Hailo's patented and field-proven AI inferencing capabilities with advanced computer vision engines, delivering premium image quality and advanced video analytics.

For more information, please visit hailo.ai.
