1.14.0.5
Hummingbird
A modern user interface library for games
Video support

Hummingbird can play video/audio through the standard <video> element like so:

<video width="320" height="240" src="my_movie.webm">
</video>

The video player will:

  • play opaque videos
  • play transparent videos
  • pass all decoded audio data to the engine for playback (it won't play audio by itself)
  • currently work only with videos
    • in the WebM format
    • with video tracks encoded in VP8 or VP9
    • with audio tracks encoded in Vorbis

The standard attributes autoplay, loop and muted are also supported.

Distribution

The video player is distributed in a separate dynamic library to reduce binary size for users who don't need it. The separate library is called MediaDecoders.[PlatformName].[PlatformExtension] (e.g. MediaDecoders.WindowsDesktop.dll on Windows). You need to place that extra binary next to the core cohtml library in your distribution package and it will be loaded automatically.

Distribution on iOS

iOS doesn't support dynamic libraries, so it's an exception to the rule above. Instead of shipping a separate dynamic library, on iOS you'll need to statically link either libMediaDecoders.iOS.a or libMediaDecodersEmpty.iOS.a, depending on whether you want to use the video feature or not.

How to encode videos

Although we support both VP8 and VP9, VP8 is much faster to both encode and decode, so we recommend it. You can experiment with your own videos by encoding them through ffmpeg; to encode a video, please download ffmpeg first. The command line that gives the best results is:

ffmpeg -i VideoIn.mp4 -vcodec libvpx -b:v 1000k -s 1280x720 -acodec libvorbis -b:a 128k -ar 48000 -ac 2 VideoOut.webm

You can adjust the resolution and bitrate. Please keep the other parameters as described in the ffmpeg docs, because they affect decoding and encoding performance.

Multithreaded decoding

Video decoding always happens through Hummingbird's task system. For this to work, you need to hook up the task system with your own engine's tasks through cohtml::LibraryParams::OnWorkAvailable.

You can further speed up decoding by telling the SDK how many worker threads can serve decoding tasks. This is done by specifying a value for cohtml::LibraryParams::ResourceThreadsCountHint.

Warning
If you set cohtml::LibraryParams::ResourceThreadsCountHint to a number higher than the actual number of threads serving the task queue, Hummingbird is very likely to deadlock. Always set the hint to a number less than or equal to the actual thread count.

Audio support

Hummingbird does not play audio by itself. All audio data is decoded, converted to Pulse-Code Modulation (PCM) and passed to the engine for further processing. The PCM data is delivered through several callbacks on the cohtml::IViewListener interface (look for the OnAudio* methods).

You can use your engine's audio system to enqueue the PCM data in the sound buffers and get it playing. There are two reference implementations available: one based on Windows' XAudio2 and one on OpenAL. Both can be found under Modules/AudioSystem/. The AudioSystem module provides an abstraction over both implementations and can also be used directly in the engine by including the source file and linking to the corresponding third-party dependencies.
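To illustrate what an engine typically does with the delivered PCM, here is a hedged sketch of one common consumer-side step: converting float samples to the 16-bit integers that backends such as XAudio2 usually expect. The float input format is an assumption; check the OnAudio* callbacks for the actual sample layout the SDK delivers.

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// Converts interleaved float PCM samples (assumed range [-1, 1]) to
// 16-bit signed integers suitable for a typical sound buffer.
std::vector<int16_t> FloatPCMToInt16(const std::vector<float>& samples) {
    std::vector<int16_t> out;
    out.reserve(samples.size());
    for (float s : samples) {
        // Clamp to [-1, 1] first to avoid integer overflow on hot samples.
        const float clamped = std::max(-1.0f, std::min(1.0f, s));
        out.push_back(static_cast<int16_t>(clamped * 32767.0f));
    }
    return out;
}
```

A conversion like this would run inside your OnAudio* handler (or on your audio thread) before the data is enqueued into the engine's sound buffers.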

Take a look at the Sample_VideoPlayer sample in the distribution package for more info.

Showing video preview {#VideoPreview}

By default, Hummingbird does not render video frames unless playback is explicitly requested. This means no image of the video will be shown before you actually play it. If you want to show a preview, you must play a very small portion of it.

That is, explicitly play the video and pause it almost immediately. This can be done by listening for the playing event of the video, which is fired after the video has started playing, and then requesting a pause in the callback. Here's an example:

<script>
    function showFirstFrame(vid) {
        // Pause as soon as the "playing" event fires, leaving the
        // first frame on screen as a preview.
        vid.addEventListener("playing", function onPlaying() {
            vid.removeEventListener("playing", onPlaying);
            vid.pause();
        });
        vid.play();
    }
    var myVideo = document.getElementById("myVideo");
    showFirstFrame(myVideo);
</script>

Resource handler

When playing video, the SDK loads the file through your implementation of cohtml::IAsyncResourceHandle::OnResourceRequest. If your implementation loads the entire file into memory, you can easily run out of memory on platforms with limited hardware. To avoid that, make sure you only load the chunks of the file that Hummingbird actually requests.

Make sure you check the reference implementation in the resource::ResourceHandler class, which is used across all the samples.
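The chunked approach can be sketched as follows, assuming the resource request exposes the byte range it wants (the real cohtml request and response types are not shown here). Only the requested chunk is read from disk, so a large video never has to fit in memory at once.

```cpp
#include <cstdint>
#include <fstream>
#include <string>
#include <vector>

// Reads only the requested [offset, offset + size) byte range of a file,
// instead of loading the whole file into memory. Returns a possibly
// shorter buffer if the range extends past the end of the file.
std::vector<char> ReadChunk(const std::string& path,
                            std::uint64_t offset, std::uint64_t size) {
    std::ifstream file(path, std::ios::binary);
    if (!file)
        return {};
    file.seekg(static_cast<std::streamoff>(offset));
    std::vector<char> buffer(static_cast<size_t>(size));
    file.read(buffer.data(), static_cast<std::streamsize>(size));
    buffer.resize(static_cast<size_t>(file.gcount())); // short read at EOF
    return buffer;
}
```

In a resource handler, a helper like this would be called with the range taken from the incoming request, and the resulting buffer handed back through the SDK's response object.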

Transparent video support

The basic authoring process of transparent videos goes roughly as follows:

  • Create a video for chroma keying: Record, or create with a video editing tool, a video in which one color is designated as transparent (the key color).
  • Split the video into an image sequence: The video must be split into images in a format that supports transparency (e.g. PNG). You can do this with most popular video editing software, or by using ffmpeg via the command line.

    The following is an ffmpeg example which removes the pink color (0xFF00FF) through a filter and outputs an image sequence:

ffmpeg -i vid.mp4 -vf colorkey=0xFF00FF:0.3:0.8 sequence/image-%05d.png

The ffmpeg manual states the following about the colorkey filter:

colorkey

RGB colorspace color keying.

The filter accepts the following options:

  • color

    The color which will be replaced with transparency.

  • similarity

    Similarity percentage with the key color.

    0.01 matches only the exact key color, while 1.0 matches everything.

  • blend

    Blend percentage.

    0.0 makes pixels either fully transparent, or not transparent at all.

    Higher values result in semi-transparent pixels, with a higher transparency the more similar the pixels color is to the key color.

In the example above, a similarity of 0.3 and a blend of 0.8 are used.

  • Convert the image sequence back into a video: One of the easiest ways to do this is with ffmpeg:

ffmpeg -i sequence/image-%05d.png -c:v libvpx -pix_fmt yuva420p -metadata:s:v:0 alpha_mode="1" output.webm

If you want to add audio, you can simply pass another input to ffmpeg:

ffmpeg -i sequence/image-%05d.png -i audio.wav -c:v libvpx -pix_fmt yuva420p -metadata:s:v:0 alpha_mode="1" output.webm

Other parameters can be passed as well, of course. Add values for the video bitrate, resolution, etc. to control the video quality.

ffmpeg -i sequence/image-%05d.png -c:v libvpx -pix_fmt yuva420p -metadata:s:v:0 alpha_mode="1" -b:v 2000k -s 1920x1080 -frame-parallel 1 -tile-rows 0 -tile-columns 6 -slices 4 output.webm

Note
Make sure you're using a recent version of ffmpeg; otherwise the colorkey filter might not be available.