media_kit

A complete video & audio library for Flutter & Dart.


Sponsored with 💖 by

Stream Chat


Rapidly ship in-app messaging with Stream's highly reliable chat infrastructure and feature-rich SDKs, including Flutter!

(Demo video: media_kit.mp4)

Installation

Add this to your pubspec.yaml:

dependencies:
  media_kit: ^0.0.1
  # For audio support.
  media_kit_core_audio: ^0.0.1
  # For video support.
  media_kit_core_video: ^0.0.1

NOTE: Only one of package:media_kit_core_video or package:media_kit_core_audio is required.

Platforms

  • Windows
  • Linux
  • macOS
  • Android
  • iOS

Docs

Brief Start

Basic example.

import 'package:media_kit/media_kit.dart';

/// Create a new [Player] instance.
final player = Player();

...
/// Open some [Media] for playback.
await player.open(
  Playlist(
    [
      Media('file:///C:/Users/Hitesh/Music/Sample.MP3'),
      Media('file:///C:/Users/Hitesh/Video/Sample.MKV'),
      Media('https://www.example.com/sample.mp4'),
      Media('rtsp://www.example.com/live'),
    ],
  ),
);

...
/// Modify speed, pitch, volume or shuffle state.
player.rate = 1.0;
player.pitch = 1.2;
player.volume = 50.0;
player.shuffle = false;

...
/// Play / Pause
player.play();
player.pause();
player.playOrPause();

...
/// Release allocated resources back to the system.
player.dispose();

...
/// Subscribe to events.
player.streams.playlist.listen((event) {
  /// Trigger UI updates etc.
  print(event);
});
player.streams.position.listen((event) {
  /// Trigger UI updates etc.
  print(event);
});
player.streams.duration.listen((event) {
  /// Trigger UI updates etc.
  print(event);
});
player.streams.audioBitrate.listen((event) {
  /// Trigger UI updates etc.
  if (event != null) {
    print('${event ~/ 1000} KB/s');
  }
});
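
Each listen call returns a regular Dart StreamSubscription. A minimal sketch of tying its lifetime to the player (the subscription variable & the ordering here are illustrative, not part of the package API):

/// Keep a reference to the subscription so it can be cancelled later,
/// e.g. from a State.dispose, before releasing the [Player] itself.
final subscription = player.streams.position.listen((position) {
  /// Trigger UI updates etc.
  print(position);
});

...
/// Cancel the subscription & release allocated resources.
await subscription.cancel();
player.dispose();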

Rendering Video

Performant & H/W accelerated; automatically falls back to S/W rendering if the system does not support it.

import 'package:flutter/material.dart';
import 'package:media_kit/media_kit.dart';
import 'package:media_kit_core_video/media_kit_core_video.dart';

class MyScreen extends StatefulWidget {
  const MyScreen({Key? key}) : super(key: key);
  @override
  State<MyScreen> createState() => _MyScreenState();
}

class _MyScreenState extends State<MyScreen> {
  // Create a [Player] instance from `package:media_kit`.
  final Player player = Player();
  // Reference to the [VideoController] instance from `package:media_kit_core_video`.
  VideoController? controller;

  @override
  void initState() {
    super.initState();
    WidgetsBinding.instance.addPostFrameCallback((_) async {
      // Create a [VideoController] instance from `package:media_kit_core_video`.
      // Pass the [handle] of the [Player] from `package:media_kit` to the [VideoController] constructor.
      controller = await VideoController.create(player.handle);
      setState(() {});
    });
  }

  @override
  Widget build(BuildContext context) {
    return Video(
      /// Pass the [controller] to display the video output.
      controller: controller,
    );
  }
}

For performance reasons (especially with S/W rendering), if you wish to limit the size of each video frame, you can pass the width & height parameters to the VideoController.create method.

controller = await VideoController.create(
  player.handle,
  width: 1920,
  height: 1080,
);

Detailed Guide

TODO: documentation

Try out the test application for more API usage examples.

Goals

The primary goal of package:media_kit is to become a strong, stable, future-proof & modular media playback library for Flutter. The idea is to support both audio & video playback.

package:media_kit makes hardware-accelerated video playback possible in Flutter.

Since targeting multiple features at once & bundling redundant native libraries can result in an increased application bundle size, you can manually select the native libraries you want to bundle, depending upon your use-case. Currently, the scope of work is limited to Windows & Linux. The code is architected to support multiple platforms & features. Support for more platforms will be added in the future.

Support

If you find package:media_kit package(s) useful, please consider sponsoring me.

Since this is a first-of-its-kind project, it takes a lot of time to experiment & develop. It is a very tedious process to write code, document, maintain & provide support for free. Your support can ensure the quality of the package your project depends upon. I will feel rewarded for my hard work & research.

Thanks!

Architecture

package:media_kit

Click on the zoom button on top-right or pinch inside.

%%{
  init: {
    'themeVariables': {
      'fontFamily': 'BlinkMacSystemFont, Segoe UI, Noto Sans, Helvetica, Arial, Apple Color Emoji, Segoe UI Emoji'
    }
  }
}%%
classDiagram

  Player *-- PlatformPlayer
  PlatformPlayer <|-- libmpv_Player
  PlatformPlayer <|-- xyz_Player
  PlatformPlayer *-- PlayerState
  PlatformPlayer *-- PlayerStreams
  PlatformPlayer o-- PlayerConfiguration

  libmpv_Player <.. NativeLibrary

  class Media {
    +String uri
    +dynamic extras
  }

  class Playlist {
    +List<Media> medias
    +int index
  }

  class PlayerConfiguration {
    + bool texture
    + bool osc
    + String vo
    + String title
    ... other platform-specific configurable values
  }

  class PlayerStreams {
    +Stream<Playlist> playlist
    +Stream<bool> isPlaying
    +Stream<bool> isCompleted
    +Stream<Duration> position
    +Stream<Duration> duration
    +Stream<double> volume
    +Stream<double> rate
    +Stream<double> pitch
    +Stream<bool> isBuffering
    +Stream<AudioParams> audioParams
    +Stream<double> audioBitrate
    +Stream<PlayerError> error
  }

  class PlayerState {
    +Playlist playlist
    +bool isPlaying
    +bool isCompleted
    +Duration position
    +Duration duration
    +double volume
    +double rate
    +double pitch
    +bool isBuffering
    +AudioParams audioParams
    +double audioBitrate
    +PlayerError error
  }

  class Player {
    +PlatformPlayer? platform

    +«get» PlayerState state
    +«get» PlayerStreams streams

    +«set» volume: double*
    +«set» rate: double*
    +«set» pitch: double*
    +«set» shuffle: bool*
    +«get» handle: Future<int>

    +open(playlist)
    +play()
    +pause()
    +playOrPause()
    +add(media)
    +remove(index)
    +next()
    +previous()
    +jump(index)
    +move(from, to)
    +seek(duration)
    +setPlaylistMode(playlistMode)
    +dispose()
  }

  class PlatformPlayer {
    +PlayerState state
    +PlayerStreams streams
    +PlayerConfiguration configuration

    +open(playlist)*
    +play()*
    +pause()*
    +playOrPause()*
    +add(media)*
    +remove(index)*
    +next()*
    +previous()*
    +jump(index)*
    +move(from, to)*
    +seek(duration)*
    +setPlaylistMode(playlistMode)*
    +dispose()*

    +«set» volume: double*
    +«set» rate: double*
    +«set» pitch: double*
    +«set» shuffle: bool*
    +«get» handle: Future<int>*

    #StreamController<Playlist> playlistController
    #StreamController<bool> isPlayingController
    #StreamController<bool> isCompletedController
    #StreamController<Duration> positionController
    #StreamController<Duration> durationController
    #StreamController<double> volumeController
    #StreamController<double> rateController
    #StreamController<double> pitchController
    #StreamController<bool> isBufferingController
    #StreamController<PlayerError> errorController
    #StreamController<AudioParams> audioParamsController
    #StreamController<double?> audioBitrateController
  }

  class libmpv_Player {
    +open(playlist)
    +play()
    +pause()
    +playOrPause()
    +add(media)
    +remove(index)
    +next()
    +previous()
    +jump(index)
    +move(from, to)
    +seek(duration)
    +setPlaylistMode(playlistMode)
    +«set» volume: double
    +«set» rate: double
    +«set» pitch: double
    +«set» shuffle: bool
    +«get» handle: Future<int>
    +dispose()
  }

  class NativeLibrary {
    +find()$ String?
  }

  class xyz_Player {
    +open(playlist)
    +play()
    +pause()
    +playOrPause()
    +add(media)
    +remove(index)
    +next()
    +previous()
    +jump(index)
    +move(from, to)
    +seek(duration)
    +setPlaylistMode(playlistMode)
    +«set» volume: double
    +«set» rate: double
    +«set» pitch: double
    +«set» shuffle: bool
    +«get» handle: Future<int>
    +dispose()
  }

package:media_kit_core_audio

N/A

package:media_kit_core_video

Click on the zoom button on top-right or pinch inside.

Windows

%%{
  init: {
    'themeVariables': {
      'fontFamily': 'BlinkMacSystemFont, Segoe UI, Noto Sans, Helvetica, Arial, Apple Color Emoji, Segoe UI Emoji'
    }
  }
}%%
classDiagram

  MediaKitCoreVideoPlugin "1" *-- "1" VideoOutputManager: Create VideoOutput(s) with VideoOutputManager for handle passed through platform channel
  VideoOutputManager "1" *-- "*" VideoOutput
  VideoOutput "1" *-- "1" ANGLESurfaceManager: Only for H/W accelerated rendering.

  class MediaKitCoreVideoPlugin {
    -flutter::PluginRegistrarWindows registrar_
    -std::unique_ptr<MethodChannel> channel_
    -std::unique_ptr<VideoOutputManager> video_output_manager_
    -HandleMethodCall(method_call, result);
  }

  class VideoOutputManager {
    +Create(handle: int, width: optional<int>, height: optional<int>)
    +Dispose(handle: int)

    -flutter::PluginRegistrarWindows registrar_
    -std::unique_ptr<MethodChannel> channel_
    -std::unordered_map<int64_t, std::unique_ptr<VideoOutput>> video_outputs_
  }

  class VideoOutput {
    +«get» texture_id: int64_t
    -mpv_handle* handle
    -mpv_render_context* context
    -std::optional<int64_t> width
    -std::optional<int64_t> height
    -int64_t texture_id_
    -flutter::PluginRegistrarWindows registrar_
    -std::unique_ptr<flutter::TextureVariant> texture_variant_
    -std::unique_ptr<ANGLESurfaceManager> surface_manager_ HW
    -std::unique_ptr<FlutterDesktopGpuSurfaceDescriptor> texture_ HW
    -std::unique_ptr<uint8_t[]> pixel_buffer_ SW
    -std::unique_ptr<FlutterDesktopPixelBuffer> pixel_buffer_texture_ SW
    -std::mutex mutex_
    -std::function texture_update_callback_

    +SetTextureUpdateCallback(callback: std::function<void(int64_t, int64_t, int64_t)>)
    -Render()
    -Resize(width: int64_t, height: int64_t)
    -GetVideoWidth(): int64_t
    -GetVideoHeight(): int64_t
  }

  class ANGLESurfaceManager {
    +«get» width: int32_t
    +«get» height: int32_t
    +«get» handle: HANDLE

    +HandleResize(width: int64_t, height: int64_t)
    +SwapBuffers()
    +MakeCurrent(value: bool)
    -Initialize()
    -InitializeD3D11()
    -InitializeD3D9()
    -CleanUp(release_context: bool)
    -CreateAndBindEGLSurface()
    -ShowFailureMessage(message: wchar_t[])

    -IDXGIAdapter* adapter_
    -int32_t width_
    -int32_t height_
    -ID3D11Device* d3d_11_device_
    -ID3D11DeviceContext* d3d_11_device_context_
    -Microsoft::WRL::ComPtr<ID3D11Texture2D>
    -Microsoft::WRL::ComPtr<IDXGISwapChain>
    -IDirect3D9Ex* d3d_9_ex_
    -IDirect3DDevice9Ex* d3d_9_device_ex_
    -IDirect3DTexture9* d3d_9_texture_
    -HANDLE handle_
    -EGLSurface surface_
    -EGLDisplay display_
    -EGLContext context_
    -EGLConfig config_
  }

Implementation

package:media_kit

package:media_kit is entirely written in Dart. It uses dart:ffi to invoke the native C API of libmpv through its shared libraries. All the callback management, event Streams & other methods to control audio/video playback are implemented in Dart with the help of FFI. Event management, i.e. the position, duration, bitrate & audioParams Streams, is important for rendering changes in the UI.
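
The general dart:ffi pattern looks like the following. This is a minimal sketch, not the actual bindings shipped with package:media_kit; the library file name & typedefs are illustrative assumptions, while mpv_create itself is a real libmpv symbol.

import 'dart:ffi';

// Native & Dart signatures for mpv_create, which returns an opaque mpv_handle*.
typedef MpvCreateNative = Pointer<Void> Function();
typedef MpvCreateDart = Pointer<Void> Function();

void main() {
  // Open the shared library (e.g. libmpv.so / mpv-2.dll / libmpv.dylib).
  final mpv = DynamicLibrary.open('libmpv.so');
  // Look up the C symbol & obtain a callable Dart function.
  final mpvCreate =
      mpv.lookupFunction<MpvCreateNative, MpvCreateDart>('mpv_create');
  // Create a new mpv instance; further mpv_* calls take this handle.
  final Pointer<Void> handle = mpvCreate();
  print(handle.address);
}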

A big limitation of FFI in the Dart SDK has been that it does not support async callbacks from another thread. Learn more about this at dart/sdk#37022. The following situation explains it better:

If you pass a function pointer from Dart to C code, you can invoke it fine. But as soon as you invoke it from some other thread on the native side, the Dart VM will instantly crash. This matters because most libmpv events take place on a background thread.

However, I could work around this within Dart because libmpv offers an "event polling"-like way to listen to events. The idea is to spawn a background Isolate that runs the event loop, take the memory address of each event & forward it outside the Isolate with the help of a ReceivePort, where it is finally interpreted using more FFI code. I have explained this in detail in the in-code comments of initializer.dart; a lot more trickery was needed to get this to work.
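
A rough sketch of that approach, assuming a hypothetical waitEventAddress helper standing in for the FFI call that wraps mpv_wait_event (this is not the actual initializer.dart code):

import 'dart:isolate';

// Hypothetical stand-in for the blocking FFI call wrapping mpv_wait_event;
// the real binding returns the event pointer's address as an int.
int waitEventAddress() => 0;

// Runs on a background Isolate: poll events & forward each event's
// memory address to the main isolate through the SendPort.
void eventLoop(SendPort port) {
  while (true) {
    final int address = waitEventAddress();
    port.send(address);
  }
}

Future<void> listenToEvents() async {
  final receiver = ReceivePort();
  await Isolate.spawn(eventLoop, receiver.sendPort);
  receiver.listen((address) {
    // Re-interpret the address as an mpv_event* with FFI on this side
    // & push the corresponding values into the player's Streams.
  });
}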

This solved the issue of events & audio playback within 100% Dart using FFI.

However, no such "event polling"-like API is possible for video rendering. It would not be performant to constantly poll video frames on a background thread & forward them back to the primary thread for rendering, and libmpv does not offer any such API anyway. So, I created a new package, package:media_kit_core_video, specifically to offer the platform-specific video playback implementation, which internally handles Flutter's Texture Registry API & libmpv's OpenGL rendering API. This package only consumes the mpv_handle* of the instance created with package:media_kit through FFI (which can easily be shared as a primitive int value) to set up a new viewport. The detailed implementation is discussed below.
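
In terms of the public API, that handoff is the player.handle getter shown earlier; a minimal sketch of what actually crosses the package boundary (variable names are illustrative):

/// [handle] resolves to a plain int: the address of the underlying
/// mpv_handle* created through FFI.
final int handle = await player.handle;
/// VideoController.create forwards it to the native side (over a
/// platform channel) to set up the viewport & Flutter texture.
final controller = await VideoController.create(player.handle);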

package:media_kit_core_audio

libmpv from mpv Media Player is used for audio playback. A minimal variant with stripped-down features & audio-only support is bundled for usage through dart:ffi.

package:media_kit_core_video

Windows

libmpv from mpv Media Player is used for video playback.

Both of these APIs, Flutter's Texture Registry API & libmpv's OpenGL rendering API, are hardware accelerated, i.e. GPU-backed buffers are used. This is a performant approach, easily capable of rendering 4K 60 FPS videos; the rest depends on the hardware. Since the libmpv API is OpenGL based & the Texture API in Flutter is Direct3D based, ANGLE (Almost Native Graphics Layer Engine) is used for interop, which translates the OpenGL ES calls into Direct3D.

This hardware accelerated video output requires DirectX 11 or higher. Most Windows systems with either integrated or discrete GPUs should already support this. On systems where Direct3D fails to load due to missing graphics drivers, an unsupported feature level or DirectX version etc., a fallback pixel-buffer-based software renderer is used. This means that the video is rendered by the CPU & every frame is copied back to RAM. This causes some redundant load on the CPU, results in decreased battery life & may not play higher-resolution videos properly. However, it works.

You can visit my experimentation repository to see a minimal example showing OpenGL ES rendering inside Flutter Windows.

Linux

libmpv from mpv Media Player is used for video playback. System shared libraries from distribution-specific user-installed packages are used by default. On Ubuntu / Debian-based systems, you can install them using:

NOTE: This package also bundles specific shared libraries & dependencies.

sudo apt install mpv libmpv-dev

On Flutter Linux, both the OpenGL (hardware accelerated) & Pixel Buffer (software) APIs are available for rendering on the Texture widget.

Outcomes

4K video playback on an entry-level AMD Ryzen 3 2200U processor with Radeon Vega 3 Mobile Graphics.

NOTES:

  • See process-specific CPU & GPU usage (media_kit_test.exe). Overall CPU usage is high due to screen recording.
  • Memory usage is higher because of the higher-resolution 4K video. General usage will be lower.

(Demo video: media_kit.mp4)

License

Copyright © 2022, Hitesh Kumar Saini <[email protected]>

This project & the work under this repository are governed by the MIT license that can be found in the LICENSE file.
