AvatarWebKit by Hallway

AvatarWebKit is a web-optimized SDK, developed by Hallway, that provides real-time blend shapes from a camera feed, video, or image. For each frame the SDK also gives head X/Y position, depth (Z), and rotation (pitch, roll, yaw). AvatarWebKit runs at 60 FPS and provides the 52 ARKit-compatible blend shapes.
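As a rough mental model of the per-frame output described above, here is an illustrative TypeScript shape. These are not the SDK's actual type definitions or field names; consult the package's typings for the real interface.

// Illustrative only: what a single prediction frame conceptually contains.
// Field names here are assumptions for readability, not the SDK's typings.
interface IllustrativePredictionFrame {
  // 52 ARKit-compatible blend shape weights, e.g. jawOpen, eyeBlinkLeft, ...
  blendShapes: Record<string, number>

  // Head position in the camera frame: X/Y plus depth (Z)
  position: { x: number; y: number; z: number }

  // Head rotation as pitch, roll, and yaw
  rotation: { pitch: number; roll: number; yaw: number }
}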

In the future, the SDK will be able to provide rigid body frame and hand positions as well.

Hallway's avatar technology is driven by machine learning models that predict highly accurate blend shapes from images and video feeds in real time. The ML pipeline is optimized for real-time video to achieve both a high frame rate and lifelike animations.

Our vision for the future is an "open metaverse" where you can take your character with you anywhere, and we believe tools like AvatarWebKit will help pave that road. The models we've provided here are free to use in your applications. Contact us about making your characters compatible with Hallway!

Installation

# yarn
yarn add @quarkworks-inc/avatar-webkit

# npm
npm install @quarkworks-inc/avatar-webkit

First Steps

  1. Get an API token

  2. Start your predictor:

import { AUPredictor } from '@quarkworks-inc/avatar-webkit'
// ...

let predictor = new AUPredictor({
  apiToken: <YOUR_API_TOKEN>,
  shouldMirrorOutput: true,
})

let stream = await navigator.mediaDevices.getUserMedia({
  audio: false,
  video: {
    width: { ideal: 640 },
    height: { ideal: 360 },
    facingMode: 'user'
  }
})

predictor.onPredict = results => {
  console.log(results)
}

// or, if you prefer RxJS
predictor.dataStream.subscribe(results => {
  console.log(results)
})

// kick off prediction on the camera stream
predictor.start({ stream })
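If you also want a self-view, the same stream from getUserMedia can be attached to a video element. This is plain browser API usage rather than part of the SDK, and the #preview element id below is just an assumption for the example.

// Optional: preview the camera stream in a <video id="preview"> element.
// Only standard browser APIs are used here; the predictor keeps consuming
// the stream exactly as shown above.
const video = document.querySelector<HTMLVideoElement>('#preview')
if (video) {
  video.srcObject = stream
  video.muted = true // avoid any feedback, even though audio is disabled
  await video.play()
}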

More Docs

Example Projects

Using AvatarWebKit

Popular model integrations

FAQ

API Token? What is that and why do I need it?

An API token is a unique identifier that allows your code to authenticate with our server when using the SDK. You can sign up for one here.

What browsers are supported?

We recommend Chromium-based browsers for the best performance, but all other major browsers are supported. We are currently working on performance improvements for Safari, Firefox, and Edge.

Is mobile supported?

The models will currently run on mobile but need to be optimized. We are working on configuration options which will allow you to choose to run lighter models.

Do you have any native SDKs?

We do not have an official native SDK yet, but our ML pipeline is native-first and the models are already used in our macOS app Hallway Tile. We have the capability to create SDKs for most common platforms (e.g. macOS/Windows/Linux, iOS/Android). Each SDK will follow the same data standard for blend shapes/predictions and will include encoders for portability between environments. This means you can do some creative things across native, web, and more!
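To make the portability idea concrete, here is a toy sketch of shipping per-frame predictions to another process or machine as JSON over a WebSocket. The endpoint and wire format are assumptions for illustration; the SDK's own encoders may differ. predictor is the instance from the First Steps example.

// Illustrative only: forward each prediction frame as JSON over a WebSocket
// so another environment (native app, server, second browser) can render it.
const socket = new WebSocket('wss://example.com/avatar-frames') // hypothetical endpoint

predictor.onPredict = results => {
  if (socket.readyState === WebSocket.OPEN) {
    socket.send(JSON.stringify(results))
  }
}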

If you are interested in native SDKs, we'd love to hear from you!

Is this production ready?

Yes, depending on your needs. There may be a couple of rough edges at the moment, but the SDK has been in use internally at our company for over a year and is in production with several pilot companies.

We do not currently offer SLAs for the SDK, but we are happy to work with you on any improvements you need to get it running in production.

Can I make feature requests?

YES!!! We are currently in an open beta and would love to hear your feedback. Contact us on Discord or by email.

What’s the best place to reach out for support?

We are active daily on our Discord and can help with any problems you may have! If Discord doesn't work for you, reach out by email.

Contact Us

Our team is primarily in U.S. timezones, but we are pretty active on Discord and over email! We'd love to hear your thoughts, feedback, and ideas, or to provide any support you need.

Other Hallway Tools

If you are using Three.js, we've released this open source tooling module you can import freely. It pairs especially well with video-call-style apps, as we provide a Three.js world setup that works well for rendering multiple avatars on screen at once, Zoom-style.
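For a rough idea of how predictions can drive an avatar even without the tooling module, here is a minimal sketch that applies blend shape weights to a Three.js mesh. It assumes your mesh exposes morph targets named after the ARKit blend shapes and that the prediction results carry a blendShapes map of name to weight; check the SDK docs and the tooling module for the real integration.

// Minimal sketch: map blend shape weights onto a mesh's morph targets.
// Assumes ARKit-style morph target names (e.g. 'jawOpen', 'eyeBlinkLeft').
import * as THREE from 'three'

function applyBlendShapes(mesh: THREE.Mesh, blendShapes: Record<string, number>) {
  const dictionary = mesh.morphTargetDictionary
  const influences = mesh.morphTargetInfluences
  if (!dictionary || !influences) return

  for (const [name, weight] of Object.entries(blendShapes)) {
    const index = dictionary[name]
    if (index !== undefined) {
      influences[index] = THREE.MathUtils.clamp(weight, 0, 1)
    }
  }
}

// Wire it to the predictor from the First Steps example (field name assumed):
// predictor.onPredict = results => applyBlendShapes(avatarMesh, results.blendShapes)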

TODO

More coming :)
