
A complete video & audio playback library for Flutter & Dart. Performant, stable, feature-proof & modular.


package:media_kit is in active development & documentation is under construction... 🏗️

Sponsored with 💖 by

Stream Chat
Rapidly ship in-app messaging with Stream's highly reliable chat infrastructure & feature-rich SDKs, including Flutter!
Try the Flutter Chat tutorial


Clever Apps for Film Professionals
media_kit.mp4

Notes:

  • The video is recorded on an entry-level AMD Ryzen 3 2200U with integrated Radeon Vega 3 Mobile Graphics.
  • See the process-specific resource usage i.e. media_kit_test.exe. Overall usage is higher because of the screen recording.

Installation

Add the following to your pubspec.yaml:

dependencies:
  media_kit: ^0.0.4
  # For video rendering.
  media_kit_video: ^0.0.4
  # Enables support for higher number of concurrent instances. Optional.
  media_kit_native_event_loop: ^1.0.2
  # Pick based on your requirements / platform:
  media_kit_libs_windows_video: ^1.0.1          # Windows package for video (& audio) native libraries.
  media_kit_libs_ios_video: ^1.0.4              # iOS package for video (& audio) native libraries.
  media_kit_libs_macos_video: ^1.0.4            # macOS package for video (& audio) native libraries.
  media_kit_libs_linux: ^1.0.1                  # Linux dependency package.

Platforms

Platform     Audio   Video
Windows      Ready   Ready
GNU/Linux    Ready   Ready
macOS        Ready   Ready
iOS          Ready   Ready
Android      WIP     WIP
Web          WIP     WIP

Guide

Control audio or video playback

import 'package:media_kit/media_kit.dart';

/// Create a [Player] instance for audio or video playback.

final player = Player();

/// Subscribe to event streams & listen to updates.

player.streams.playlist.listen((e) => print(e));
player.streams.playing.listen((e) => print(e));
player.streams.completed.listen((e) => print(e));
player.streams.position.listen((e) => print(e));
player.streams.duration.listen((e) => print(e));
player.streams.volume.listen((e) => print(e));
player.streams.rate.listen((e) => print(e));
player.streams.pitch.listen((e) => print(e));
player.streams.buffering.listen((e) => print(e));
player.streams.audioParams.listen((e) => print(e));
player.streams.audioBitrate.listen((e) => print(e));
player.streams.audioDevice.listen((e) => print(e));
player.streams.audioDevices.listen((e) => print(e));


/// Open a playable [Media] or [Playlist].

await player.open(Media('file:///C:/Users/Hitesh/Music/Sample.mp3'));
await player.open(Media('file:///C:/Users/Hitesh/Video/Sample.mkv'));
await player.open(Media('asset:///videos/bee.mp4'));
await player.open(
  Playlist(
    [
      Media('https://www.example.com/sample.mp4'),
      Media('rtsp://www.example.com/live'),
    ],
    /// Select the starting index.
    index: 1,
  ),
  /// Select whether playback should start or not.
  play: false,
);

/// Control playback state.

await player.play();
await player.pause();
await player.playOrPause();
await player.seek(const Duration(seconds: 10));

/// Use or modify the queue.

await player.next();
await player.previous();
await player.jump(2);
await player.add(Media('https://www.example.com/sample.mp4'));
await player.move(0, 2);

/// Customize speed, pitch, volume, shuffle, playlist mode, audio device.

await player.setRate(1.0);
await player.setPitch(1.2);
await player.setVolume(50.0);
await player.setShuffle(false);
await player.setPlaylistMode(PlaylistMode.loop);
await player.setAudioDevice(AudioDevice.auto());

/// Release allocated resources back to the system.

await player.dispose();

Display video output

Performant & H/W accelerated; automatically falls back to S/W rendering if the system does not support it.

import 'package:flutter/material.dart';

import 'package:media_kit/media_kit.dart';                        /// Provides [Player], [Media], [Playlist] etc.
import 'package:media_kit_video/media_kit_video.dart';            /// Provides [VideoController] & [Video] etc.

class MyScreen extends StatefulWidget {
  const MyScreen({Key? key}) : super(key: key);
  @override
  State<MyScreen> createState() => _MyScreenState();
}

class _MyScreenState extends State<MyScreen> {
  /// Create a [Player].
  final Player player = Player();
  /// Store reference to the [VideoController].
  VideoController? controller;

  @override
  void initState() {
    super.initState();
    Future.microtask(() async {
      /// Create a [VideoController] to show video output of the [Player].
      controller = await VideoController.create(player);
      setState(() {});
    });
  }

  @override
  void dispose() {
    Future.microtask(() async {
      /// Release allocated resources back to the system.
      await controller?.dispose();
      await player.dispose();
    });
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    /// Use [Video] widget to display the output.
    return Video(
      /// Pass the [controller].
      controller: controller,
    );
  }
}

Performance

Although package:media_kit is already fairly performant, you can further optimize things as follows:

Note

  • You can limit the size of the video output by specifying width & height.
  • By default, both height & width are null i.e. the output is based on the video's resolution.
final controller = await VideoController.create(
  player,
  width: 640,                                   // default: null
  height: 360,                                  // default: null
);

Note

  • You can switch between GPU & CPU rendering by specifying enableHardwareAcceleration.
  • By default, enableHardwareAcceleration is true i.e. GPU (Direct3D/OpenGL/METAL) is utilized.
final controller = await VideoController.create(
  player,
  enableHardwareAcceleration: false,            // default: true
);

Note

  • You can disable event callbacks for a Player & save yourself a few CPU cycles.
  • By default, events is true i.e. event streams & states are updated.
final player = Player(
  configuration: PlayerConfiguration(
    events: false,                              // default: true
  ),
);

Detailed Guide

TODO: documentation

Try out the test application for now.

Setup

Windows

Windows 7 or higher is supported.

Everything is ready. Just add any one of the following packages to your pubspec.yaml (if not already):

dependencies:
  ...
  media_kit_libs_windows_video: ^1.0.1       # Windows package for video (& audio) native libraries.
  media_kit_libs_windows_audio: ^1.0.1       # Windows package for audio (only) native libraries.

Linux

Any modern Linux distribution is supported.

System shared libraries from distribution-specific user-installed packages are used by default. You can install these as follows.

Ubuntu / Debian

sudo apt install libmpv-dev mpv

Packaging

There are other ways to bundle these within your app package, e.g. within Snap or Flatpak.

Add the following package to your pubspec.yaml (if not already).

dependencies:
  ...
  media_kit_libs_linux: ^1.0.1                   # GNU/Linux dependency package.

macOS

macOS 11.0 or higher is supported.

Everything is ready. Just add any one of the following packages to your pubspec.yaml (if not already).

dependencies:
  ...
  media_kit_libs_macos_video: ^1.0.4             # macOS package for video (& audio) native libraries.
  media_kit_libs_macos_audio: ^1.0.4             # macOS package for audio (only) native libraries.

During the build phase, the following warnings are not critical and cannot be silenced:

#import "Headers/media_kit_video-Swift.h"
        ^
/path/to/media_kit/media_kit_test/build/macos/Build/Products/Debug/media_kit_video/media_kit_video.framework/Headers/media_kit_video-Swift.h:270:31: warning: 'objc_ownership' only applies to Objective-C object or block pointer types; type here is 'CVPixelBufferRef' (aka 'struct __CVBuffer *')
- (CVPixelBufferRef _Nullable __unsafe_unretained)copyPixelBuffer SWIFT_WARN_UNUSED_RESULT;
# 1 "<command line>" 1
 ^
<command line>:20:9: warning: 'POD_CONFIGURATION_DEBUG' macro redefined
#define POD_CONFIGURATION_DEBUG 1 DEBUG=1 
        ^
#define POD_CONFIGURATION_DEBUG 1
        ^

iOS

iOS 13.0 or higher is supported.

Everything is ready. Just add any one of the following packages to your pubspec.yaml (if not already):

dependencies:
  media_kit_libs_ios_video: ^1.0.4              # iOS package for video (& audio) native libraries.
  media_kit_libs_ios_audio: ^1.0.4              # iOS package for audio (only) native libraries.

Known bug: The sound does not work in the simulator.

Goals

package:media_kit is a library for Flutter & Dart which provides audio & video playback.

  • Strong: Supports most audio & video codecs.
  • Performant:
    • Handles multiple FHD videos flawlessly.
    • Rendering is GPU powered (hardware accelerated).
    • 4K / 8K 60 FPS is supported.
  • Stable: The implementation is well tested & used across a number of intensive media playback apps.
  • Feature Proof: A simple usage API offering a large number of features to target a multitude of apps.
  • Modular: The project is split into a number of packages to reduce bundle size.
  • Cross Platform: Implementation works on all platforms supported by Flutter & Dart:
    • Windows
    • GNU/Linux
    • macOS
    • iOS
    • Android [WIP]
    • Web [WIP]
  • Flexible Architecture:
    • A major part of the implementation (80%+) is in 100% Dart (using FFI) & shared across platforms.
      • Makes the behavior of the library consistent & more predictable across platforms.
      • Makes development & implementation of new features easier & faster.
      • Avoids separate maintenance of a native implementation for each platform.
    • Only the video embedding code is platform-specific & part of a separate package.
    • Learn more: Architecture, Implementation.

The project aims to meet the demands of the community; this includes:

  1. Holding accountability.
  2. Ensuring timely maintenance.

Fund Development

If you find package:media_kit package(s) useful, please consider sponsoring me.

Since this is a first-of-its-kind project, it takes a lot of time to experiment & develop. It's a very tedious process to write code, document, maintain & provide support for free. Your support can ensure the quality of the package your project depends upon. I will feel rewarded for my hard work & research.

Thanks!

Architecture

package:media_kit

Click on the zoom button on top-right or pinch inside.

%%{
  init: {
    'themeVariables': {
      'fontFamily': 'BlinkMacSystemFont, Segoe UI, Noto Sans, Helvetica, Arial, Apple Color Emoji, Segoe UI Emoji'
    }
  }
}%%
classDiagram

  Player *-- PlatformPlayer
  PlatformPlayer <|-- libmpv_Player
  PlatformPlayer <|-- xyz_Player
  PlatformPlayer *-- PlayerState
  PlatformPlayer *-- PlayerStreams
  PlatformPlayer o-- PlayerConfiguration

  libmpv_Player <.. NativeLibrary

  class Media {
    +String uri
    +dynamic extras
  }

  class Playlist {
    +List<Media> medias
    +int index
  }

  class PlayerConfiguration {
    + bool osc
    + String vo
    + String title
    ... other platform-specific configurable values
  }

  class PlayerStreams {
    +Stream<Playlist> playlist
    +Stream<bool> isPlaying
    +Stream<bool> isCompleted
    +Stream<Duration> position
    +Stream<Duration> duration
    +Stream<double> volume
    +Stream<double> rate
    +Stream<double> pitch
    +Stream<bool> isBuffering
    +Stream<AudioParams> audioParams
    +Stream<double> audioBitrate
    +Stream<PlayerError> error
  }

  class PlayerState {
    +Playlist playlist
    +bool isPlaying
    +bool isCompleted
    +Duration position
    +Duration duration
    +double volume
    +double rate
    +double pitch
    +bool isBuffering
    +AudioParams audioParams
    +double audioBitrate
    +PlayerError error
  }

  class Player {
    +PlatformPlayer? platform

    +«get» PlayerState state
    +«get» PlayerStreams streams

    +«set» volume: double*
    +«set» rate: double*
    +«set» pitch: double*
    +«set» shuffle: bool*
    +«get» handle: Future<int>

    +open(playlist)
    +play()
    +pause()
    +playOrPause()
    +add(media)
    +remove(index)
    +next()
    +previous()
    +jump(index)
    +move(from, to)
    +seek(duration)
    +setPlaylistMode(playlistMode)
    +dispose()
  }

  class PlatformPlayer {
    +PlayerState state
    +PlayerStreams streams
    +PlayerConfiguration configuration

    +open(playlist)*
    +play()*
    +pause()*
    +playOrPause()*
    +add(media)*
    +remove(index)*
    +next()*
    +previous()*
    +jump(index)*
    +move(from, to)*
    +seek(duration)*
    +setPlaylistMode(playlistMode)*
    +dispose()*

    +«set» volume: double*
    +«set» rate: double*
    +«set» pitch: double*
    +«set» shuffle: bool*
    +«get» handle: Future<int>*

    #StreamController<Playlist> playlistController
    #StreamController<bool> isPlayingController
    #StreamController<bool> isCompletedController
    #StreamController<Duration> positionController
    #StreamController<Duration> durationController
    #StreamController<double> volumeController
    #StreamController<double> rateController
    #StreamController<double> pitchController
    #StreamController<bool> isBufferingController
    #StreamController<PlayerError> errorController
    #StreamController<AudioParams> audioParamsController
    #StreamController<double?> audioBitrateController
  }

  class libmpv_Player {
    +open(playlist)
    +play()
    +pause()
    +playOrPause()
    +add(media)
    +remove(index)
    +next()
    +previous()
    +jump(index)
    +move(from, to)
    +seek(duration)
    +setPlaylistMode(playlistMode)
    +«set» volume: double
    +«set» rate: double
    +«set» pitch: double
    +«set» shuffle: bool
    +«get» handle: Future<int>
    +dispose()
  }

  class NativeLibrary {
    +find()$ String?
  }

  class xyz_Player {
    +open(playlist)
    +play()
    +pause()
    +playOrPause()
    +add(media)
    +remove(index)
    +next()
    +previous()
    +jump(index)
    +move(from, to)
    +seek(duration)
    +setPlaylistMode(playlistMode)
    +«set» volume: double
    +«set» rate: double
    +«set» pitch: double
    +«set» shuffle: bool
    +«get» handle: Future<int>
    +dispose()
  }

package:media_kit_video

Click on the zoom button on top-right or pinch inside.

Windows

%%{
  init: {
    'themeVariables': {
      'fontFamily': 'BlinkMacSystemFont, Segoe UI, Noto Sans, Helvetica, Arial, Apple Color Emoji, Segoe UI Emoji'
    }
  }
}%%
classDiagram

  MediaKitVideoPlugin "1" *-- "1" VideoOutputManager: Create VideoOutput(s) with VideoOutputManager for handle passed through platform channel
  VideoOutputManager "1" *-- "*" VideoOutput: Takes PluginRegistrarWindows as reference
  VideoOutputManager "1" *-- "1" ThreadPool
  VideoOutput "*" o-- "1" ThreadPool: Post creation, resize & render etc. tasks involving EGL to ensure synchronous EGL/ANGLE usage across multiple VideoOutput(s)
  VideoOutput "1" *-- "1" ANGLESurfaceManager: Only for H/W accelerated rendering

  class MediaKitVideoPlugin {
    -flutter::PluginRegistrarWindows registrar_
    -std::unique_ptr<MethodChannel> channel_
    -std::unique_ptr<VideoOutputManager> video_output_manager_
    -HandleMethodCall(method_call, result);
  }

  class ThreadPool {
    +Post(function: std::function)
  }

  class VideoOutputManager {
    +Create(handle: int, width: optional<int>, height: optional<int>, texture_update_callback: std::function)
    +Dispose(handle: int)

    -std::mutex mutex_
    -std::unique_ptr<ThreadPool> thread_pool_
    -flutter::PluginRegistrarWindows registrar_
    -std::unordered_map<int64_t, std::unique_ptr<VideoOutput>> video_outputs_
  }

  class VideoOutput {
    +«get» texture_id: int64_t
    -mpv_handle* handle
    -mpv_render_context* context
    -std::optional<int64_t> width
    -std::optional<int64_t> height
    -int64_t texture_id_
    -flutter::PluginRegistrarWindows registrar_
    -ThreadPool* thread_pool_ref_
    -std::unordered_map<int64_t, std::unique_ptr<flutter::TextureVariant>> texture_variants_
    -std::unique_ptr<ANGLESurfaceManager> surface_manager_ HW
    -std::unordered_map<int64_t, std::unique_ptr<FlutterDesktopGpuSurfaceDescriptor>> textures_ HW
    -std::unique_ptr<uint8_t[]> pixel_buffer_ SW
    -std::unordered_map<int64_t, std::unique_ptr<FlutterDesktopPixelBuffer>> pixel_buffer_textures_ SW
    -std::function texture_update_callback_

    +SetTextureUpdateCallback(callback: std::function<void(int64_t, int64_t, int64_t)>)
    -Render()
    -CheckAndResize()
    -Resize(required_width: int64_t, required_height: int64_t)
    -GetVideoWidth(): int64_t
    -GetVideoHeight(): int64_t
  }

  class ANGLESurfaceManager {
    +«get» width: int32_t
    +«get» height: int32_t
    +«get» handle: HANDLE

    +HandleResize(width: int32_t, height: int32_t)
    +Draw(draw_callback: std::function<void()>)
    +Read()
    +SwapBuffers()
    +MakeCurrent(value: bool)
    -CreateEGLDisplay()
    -Initialize()
    -InitializeD3D11()
    -InitializeD3D9()
    -CleanUp(release_context: bool)
    -CreateEGLDisplay()
    -ShowFailureMessage(message: wchar_t[])

    -IDXGIAdapter* adapter_
    -int32_t width_
    -int32_t height_
    -ID3D11Device* d3d_11_device_
    -ID3D11DeviceContext* d3d_11_device_context_
    -Microsoft::WRL::ComPtr<ID3D11Texture2D>
    -Microsoft::WRL::ComPtr<IDXGISwapChain>
    -IDirect3D9Ex* d3d_9_ex_
    -IDirect3DDevice9Ex* d3d_9_device_ex_
    -IDirect3DTexture9* d3d_9_texture_
    -HANDLE handle_
    -EGLSurface surface_
    -EGLDisplay display_
    -EGLContext context_
    -EGLConfig config_
  }

Linux

%%{
  init: {
    'themeVariables': {
      'fontFamily': 'BlinkMacSystemFont, Segoe UI, Noto Sans, Helvetica, Arial, Apple Color Emoji, Segoe UI Emoji'
    }
  }
}%%
classDiagram

  MediaKitVideoPlugin "1" *-- "1" VideoOutputManager: Create VideoOutput(s) with VideoOutputManager for handle passed through platform channel
  VideoOutputManager "1" *-- "*" VideoOutput: Takes FlTextureRegistrar as reference
  VideoOutput "1" *-- "1" TextureGL: For H/W rendering.
  TextureGL "1" o-- "1" VideoOutput: Take VideoOutput as reference
  VideoOutput "1" *-- "1" TextureSW: For S/W rendering.
  TextureSW "1" o-- "1" VideoOutput: Take VideoOutput as reference
  TextureGL "1" <-- "1" FlTextureGL
  TextureSW "1" <-- "1" FlTexture

  class MediaKitVideoPlugin {
    -FlMethodChannel* channel
    -VideoOutputManager* video_output_manager
  }

  class VideoOutputManager {
    -GHashTable* video_outputs
    -FlTextureRegistrar* texture_registrar
    +video_output_manager_create(self: VideoOutputManager*, handle: gint64, width: gint64, height: gint64, texture_update_callback: TextureUpdateCallback, texture_update_callback_context: gpointer)
    +video_output_manager_dispose(self: VideoOutputManager*, handle: gint64)
  }

  class VideoOutput {
    -TextureGL* texture_gl
    -GdkGLContext* context_gl
    -mpv_handle* handle
    -mpv_render_context* render_context
    -gint64 width
    -gint64 height
    -TextureUpdateCallback texture_update_callback
    -gpointer texture_update_callback_context
    -FlTextureRegistrar* texture_registrar
    +video_output_set_texture_update_callback(self: VideoOutput*, texture_update_callback: TextureUpdateCallback, texture_update_callback_context: gpointer)
    +video_output_get_render_context(self: VideoOutput*): mpv_render_context*
    +video_output_get_width(self: VideoOutput*): gint64
    +video_output_get_height(self: VideoOutput*): gint64
    +video_output_get_texture_id(self: VideoOutput*): gint64
    +video_output_notify_texture_update(self: VideoOutput*);
  }

  class TextureGL {
    -guint32 name
    -guint32 fbo
    -guint32 current_width
    -guint32 current_height
    -VideoOutput* video_output
    texture_gl_populate_texture(texture: FlTextureGL*, target: guint32*, name: guint32*, width: guint32*, height: guint32*, error: GError**): gboolean
  }

  class TextureSW {
    -guint32 current_width
    -guint32 current_height
    -VideoOutput* video_output
    texture_sw_copy_pixels(texture: FlPixelBufferTexture*, buffer: const uint8_t**, width: uint32_t*, height: uint32_t*, error: GError**): gboolean
  }

macOS

TODO: documentation

iOS

TODO: documentation

Android

%%{
  init: {
    'themeVariables': {
      'fontFamily': 'BlinkMacSystemFont, Segoe UI, Noto Sans, Helvetica, Arial, Apple Color Emoji, Segoe UI Emoji'
    }
  }
}%%
classDiagram

  MediaKitVideoPlugin "1" *-- "1" VideoOutputManager: Create VideoOutput(s) with VideoOutputManager for handle passed through platform channel
  VideoOutputManager "1" *-- "*" VideoOutput: Create VideoOutput(s) to send back id & wid for render. Dispose to release.
  VideoOutput <.. MediaKitAndroidHelper: Create & dispose JNI global object reference to android.view.Surface (for --wid)
  MediaKitVideoPlugin <.. MPVLib: Initialize libmpv for usage on Android. Taken from mpv for Android.
  
  class MediaKitVideoPlugin {
    -MethodChannel channel
    -VideoOutputManager videoOutputManager
  }
  
  class VideoOutputManager {
    -HashMap<Long, VideoOutput> videoOutputs
    -TextureRegistry textureRegistryReference
    -Object lock
    
    +create(long handle): VideoOutput
    +dispose(long handle): void
  }
  
  class VideoOutput {
    -Surface surface
    -TextureRegistry.SurfaceTextureEntry surfaceTextureEntry
    
    +long id
    +long wid
    
    +dispose()
  }
  
  class MediaKitAndroidHelper {
    +newGlobalObjectRef(obj: Object): long
    +deleteGlobalObjectRef(ref: long): void
    +setApplicationContext(context: Context): void
    +copyAssetToExternalFilesDir(assetName: String): String
  }
  
  class MPVLib {
    +create(context: Context): context
  }
  

Implementation

libmpv is used for audio & video playback. It seems the best possible option since it supports a wide variety of audio & video formats, provides hardware acceleration & keeps the bundle size minimal (only the required decoders etc. are selected in FFmpeg/mpv).

Another major advantage is that a large part of the implementation (80%+) is shared across platforms using FFI. This makes the behavior of the package very similar on all supported platforms & makes maintenance easier (since there is less code & most of it is in Dart).

Alternative backends may be implemented in the future to meet certain demands (& the project architecture makes it possible).

package:media_kit

package:media_kit is entirely written in Dart. It uses dart:ffi to invoke the native C API of libmpv through its shared libraries. All the callback management, event Streams & other methods to control audio/video playback are implemented in Dart with the help of FFI. Event management, i.e. the position, duration, bitrate & audioParams Streams, is important for rendering changes in the UI.
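
For illustration, here is a minimal sketch (not the package's actual internals) of how dart:ffi can talk to libmpv directly. The library file name & the playback command below are assumptions; package:media_kit resolves the real library path per platform (see NativeLibrary.find() in the class diagram).

import 'dart:ffi';
import 'package:ffi/ffi.dart';

// C signatures of the libmpv functions used below.
typedef MpvCreateNative = Pointer<Void> Function();
typedef MpvCreateDart = Pointer<Void> Function();
typedef MpvInitializeNative = Int32 Function(Pointer<Void> ctx);
typedef MpvInitializeDart = int Function(Pointer<Void> ctx);
typedef MpvCommandStringNative = Int32 Function(Pointer<Void> ctx, Pointer<Utf8> args);
typedef MpvCommandStringDart = int Function(Pointer<Void> ctx, Pointer<Utf8> args);

void main() {
  // Assumption: libmpv is discoverable under this name on the current platform.
  final mpv = DynamicLibrary.open('libmpv.so');

  final mpvCreate = mpv.lookupFunction<MpvCreateNative, MpvCreateDart>('mpv_create');
  final mpvInitialize = mpv.lookupFunction<MpvInitializeNative, MpvInitializeDart>('mpv_initialize');
  final mpvCommandString = mpv.lookupFunction<MpvCommandStringNative, MpvCommandStringDart>('mpv_command_string');

  // Create & initialize an mpv instance, then queue a file for playback.
  final handle = mpvCreate();
  mpvInitialize(handle);
  final command = 'loadfile sample.mp3'.toNativeUtf8();
  mpvCommandString(handle, command);
  calloc.free(command);
}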

A big limitation with FFI in the Dart SDK has been that it does not support async callbacks from another thread. Learn more about this at dart-lang/sdk#37022. The following situation explains it better:

If you pass a function pointer from Dart to C code, you can invoke it fine. But, as soon as you invoke it from some other thread on the native side, the Dart VM instantly crashes. This feature is important because most events take place on a background thread.

However, I could easily work around this within Dart because libmpv offers an "event polling"-like way to listen to events. The idea was to spawn a background Isolate, where the event loop runs. I get the memory address of each event and forward it outside the Isolate with the help of a ReceivePort, where I finally interpret it using more FFI code. I have explained this in detail within the in-code comments of initializer.dart, where I had to perform a lot more trickery to get this to work.
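
A conceptual sketch of that Isolate-based approach follows; createHandle, waitEvent & interpretEvent are hypothetical placeholders for the actual FFI bindings, whose real logic lives in initializer.dart.

import 'dart:ffi';
import 'dart:isolate';

// Hypothetical placeholders for the FFI bindings (mpv_create + mpv_initialize,
// mpv_wait_event & the event interpretation code).
Pointer<Void> createHandle() => throw UnimplementedError();
Pointer<Void> waitEvent(Pointer<Void> handle, double timeout) => throw UnimplementedError();
void interpretEvent(Pointer<Void> event) => throw UnimplementedError();

/// Runs inside the background [Isolate]: polls libmpv for events & forwards
/// the raw memory address of each event to the main isolate.
void eventLoop(SendPort sendPort) {
  final handle = createHandle();
  while (true) {
    final event = waitEvent(handle, -1.0);
    // Only the address (a plain int) crosses the isolate boundary.
    sendPort.send(event.address);
  }
}

Future<void> listenToEvents() async {
  final receivePort = ReceivePort();
  await Isolate.spawn(eventLoop, receivePort.sendPort);
  receivePort.listen((address) {
    // Re-interpret the address with FFI on the main isolate & update the
    // corresponding streams / states.
    interpretEvent(Pointer<Void>.fromAddress(address as int));
  });
}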

Thus, invoking native methods & handling events etc. could be done within 100% Dart using FFI. This is enough for audio playback & supports both the Flutter SDK & the Dart VM, with event handling working entirely within Dart. Later, it was discovered that going beyond a certain number of simultaneous instances caused a deadlock (dart-lang/sdk#51254 & dart-lang/sdk#51261), freezing the UI entirely alongside any other Dart code in execution. To deal with this, a new package, package:media_kit_native_event_loop, was created. Adding package:media_kit_native_event_loop to pubspec.yaml automatically resolves this issue without any changes to code!

However, no such "event polling"-like API is possible for video rendering. So, the best idea seemed to be to create a new package, package:media_kit_video, specifically offering a platform-specific video embedding implementation which internally handles Flutter's Texture Registry API & libmpv's OpenGL rendering API. This package only consumes the mpv_handle* (which can easily be shared as a primitive int value) of the instance (created with package:media_kit through FFI) to set up a new viewport. The detailed implementation is discussed below.
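
As a rough sketch of that hand-off, assuming only the handle getter shown in the class diagram & the VideoController.create() call from the guide above:

import 'package:media_kit/media_kit.dart';
import 'package:media_kit_video/media_kit_video.dart';

Future<VideoController> setupViewport(Player player) async {
  // The mpv_handle* of the instance is exposed as a plain integer.
  final int handle = await player.handle;
  print('mpv_handle* address: $handle');
  // VideoController.create() internally forwards this handle to the
  // platform-specific code, which attaches a texture / viewport to it.
  return VideoController.create(player);
}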

package:media_kit_native_event_loop

Platform-specific threaded event handling for media_kit. Enables support for a higher number of concurrent instances.

The package contains a minimal C++ implementation which spawns a detached std::thread. This thread runs the mpv_wait_event loop & forwards the events to the Dart VM using postCObject, SendPort & ReceivePort. The necessary mutex synchronization also takes place.

The Isolate-based event loop is avoided once this package is added to the project.
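
On the Dart side, this boils down to handing a native port ID over to the C++ thread. A hedged sketch, where registerNativeEventLoop is a hypothetical stand-in for the package's actual FFI entry point:

import 'dart:isolate';

// Hypothetical placeholder for the FFI call into the package's C++ code.
void registerNativeEventLoop(int nativePortId) => throw UnimplementedError();

void attachNativeEventLoop() {
  final receivePort = ReceivePort();
  // The raw port ID is handed to native code; the detached std::thread runs
  // mpv_wait_event & posts each event back to this port with postCObject.
  registerNativeEventLoop(receivePort.sendPort.nativePort);
  receivePort.listen((dynamic message) {
    // Each message corresponds to one mpv event forwarded from the native thread.
    print('event from native event loop: $message');
  });
}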

package:media_kit_video

Windows

Rendering is hardware accelerated i.e. GPU-backed buffers are used. This is a performant approach, easily capable of rendering 4K 60 FPS videos; the rest depends on the hardware. Since the libmpv API is OpenGL based & the Texture API in Flutter Windows is Direct3D based, ANGLE (Almost Native Graphics Layer Engine) is used for interop, translating the OpenGL ES 2.0 calls into Direct3D.

This hardware accelerated video output requires DirectX 11 or higher. Most Windows systems with either integrated or discrete GPUs should already support it. On systems where Direct3D fails to load due to missing graphics drivers, an unsupported feature level or DirectX version etc., a fallback pixel-buffer based software renderer is used. This means the video is rendered by the CPU & every frame is copied back to RAM. It causes some redundant load on the CPU, results in decreased battery life & may not play higher resolution videos properly. However, it works well.

Windows 7 & 8.x also work correctly.


You can visit my experimentation repository to see a minimal example showing OpenGL ES usage in Flutter Windows.

Linux

On Flutter Linux, both OpenGL (H/W) & pixel buffer (S/W) APIs are available for rendering on Texture widget.

macOS

On macOS the current implementation is based on libmpv and can be summarized as follows:

  1. H/W video decoding: the mpv option hwdec is set to auto & does not depend on a pixel buffer.
  2. OpenGL rendering to an OpenGL texture backed by a pixel buffer, which makes it interoperable with Metal (CVPixelBuffer).

iOS

iOS shares much of its implementation with macOS. The only difference is that OpenGL ES is used instead of OpenGL.

Android

On Android, the texture registry API is based on android.graphics.SurfaceTexture.

libmpv can render directly onto an android.view.Surface after setting --wid. Creating a new android.view.Surface requires a reference to an existing android.graphics.SurfaceTexture, which can be consumed from the texture entry created by Flutter itself.

This requires --hwdec=mediacodec for hardware decoding, along with --vo=mediacodec_embed and --wid=(intptr_t)(*android.view.Surface).

More details may be found at: https://mpv.io/manual/stable/#video-output-drivers-mediacodec-embed
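
A minimal sketch of applying the mediacodec options described above through dart:ffi; the surfaceRef value is hypothetical & would normally come from MediaKitAndroidHelper.newGlobalObjectRef().

import 'dart:ffi';
import 'package:ffi/ffi.dart';

typedef MpvSetOptionStringNative = Int32 Function(Pointer<Void> ctx, Pointer<Utf8> name, Pointer<Utf8> data);
typedef MpvSetOptionStringDart = int Function(Pointer<Void> ctx, Pointer<Utf8> name, Pointer<Utf8> data);

void configureMediaCodecOutput(DynamicLibrary mpv, Pointer<Void> handle, int surfaceRef) {
  final setOption = mpv.lookupFunction<MpvSetOptionStringNative, MpvSetOptionStringDart>('mpv_set_option_string');

  void apply(String name, String value) {
    final n = name.toNativeUtf8();
    final v = value.toNativeUtf8();
    setOption(handle, n, v);
    calloc.free(n);
    calloc.free(v);
  }

  apply('hwdec', 'mediacodec');      // hardware decoding.
  apply('vo', 'mediacodec_embed');   // render directly onto the android.view.Surface.
  apply('wid', '$surfaceRef');       // JNI global reference to the Surface.
}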

Obtaining a global reference pointer to a Java object (android.view.Surface in our case) requires JNI. For this, a custom shared library is used; you can find its implementation at media-kit/media-kit-android-helper. Since compiling it would require the NDK (& make the process tedious), pre-built shared libraries are bundled for each architecture at development/build time.

Since package:media_kit is a Dart package (which works independently of Flutter), accessing assets was a challenging part. The shared libraries generated by media-kit/media-kit-android-helper help access assets bundled inside the Android APK from Dart (using FFI, without depending on Flutter).

License

Copyright © 2021 & onwards, Hitesh Kumar Saini <[email protected]>

This project & the work under this repository is governed by the MIT license, which can be found in the LICENSE file.
