Planet VideoLAN

Welcome to Planet VideoLAN. This page gathers the blogs and feeds of VideoLAN's developers and contributors. As such, it doesn't necessarily represent the opinion of all the developers or of the VideoLAN project.

VLC for Android

July 18, 2018

VLC for iOS and UWP 3.1.0 release

Jean-Baptiste Kempf

VLC 3.1.0 release

After a few months since the release of VLC 3.0, today we release VLC 3.1.0 on two mobile platforms: iOS and the Windows Store (UWP).

This release brings Chromecast integration to iOS and UWP, as was already available in the desktop and Android versions.

ChromeCast and hardware encoding

Moreover, this release supports Chromecast in a more performant way, because we added hardware encoders on those 2 platforms.
Indeed, for local streaming, we care more about speed and battery saving than about bandwidth efficiency, so hardware encoding is a good fit.

On iOS, we're using the standard VideoToolbox hardware encoding to produce H.264 streams, muxed in MKV.

On UWP, we're using Quick Sync Video for Intel CPUs (which covers almost all CPUs since the 3rd Core generation).

In fact, VLC has had a QSV encoder since 2013, but it's very rarely used, because people usually prefer software encoding (x264). Here, we fixed it and modified it to work inside the UWP sandbox.


You should really read Caro's blog post here!

But in that version you have:

  • ChromeCast,
  • 360 video support, with sensors,
  • Numerous bug fixes in the playback core (inherited mostly from VLC 3.0.1-3.0.3),
  • Some decoding speed improvements,
  • Quite a few interface bug fixes (see the 3.1.0 milestone)


The UWP version is similar to the iOS one, in that it has hardware encoding and Chromecast integration.

As explained, the hardware encoding is done using QSV.

It also features a large rework of the codebase and fixes a very large number of crashes.

Also, funnily enough, we've worked on the 8.1 version too, and we will push that one to the store soon. This includes Surface RT devices, even if Microsoft has forgotten about them!

So VLC 3.1.0, UWP version will be out for:

  • Windows 10 Desktop (x86)
  • XBox One
  • Windows 10 Mobile (ARM)
  • Windows 8.1 Desktop (x86)
  • Windows 8.1 RT (ARM)

Once we fix an issue, we might even do a Windows Phone 8.1 version.

The Windows 10 versions are on the store today, and we're waiting for a deployment issue to be fixed to push the 8.1 versions!

(Note: if you are from Windows Central, you can contact me for more details)

Have fun!

July 18, 2018 10:06 PM

Welcome back!

After quite a bit of time away from the blog, I am back around here.

The biggest reason for this silence was that blogging was taking a lot of my time, while I got almost no positive feedback on those posts.

Let's see if we can do better this time :)

Here is a small cone, to make you happier:

Large Cone

July 18, 2018 09:21 PM

July 13, 2018

Modern concurrency on Android with Kotlin

Geoffrey Métais

The current Java/Android concurrency framework leads to callback hell and blocking states, because we have no other simple way to guarantee thread safety.

With coroutines, Kotlin brings a very efficient and complete framework to manage concurrency in a more performant and simpler way.

Suspending vs blocking

Coroutines do not replace threads; they are more like a framework to manage them.
Their philosophy is to define an execution context which allows you to wait for background operations to complete, without blocking the original thread.

The goal here is to avoid callbacks and make concurrency easier.

Basic usage

A very simple first example: we launch a coroutine in the UI context. In it, we retrieve an image from the IO context, then process it back in the UI one.

launch(UI) {
    val image = withContext(IO) { getImage() } // Get from IO context
    imageView.setImageBitmap(image) // Back on main thread
}

Straightforward code, like a single-threaded function. And while getImage runs in the IO dedicated thread, the main thread is free for any other job! The withContext function suspends the current coroutine while its action (getImage()) is running. As soon as getImage() returns and the main looper is available, the coroutine resumes on the main thread, and imageView.setImageBitmap(image) is called.

Second example: we now want two background tasks done before using their results. We will use the async/await duo to make them run in parallel, and use their results on the main thread as soon as both are ready:

val job = launch(UI) {
    val deferred1 = async { getFirstValue() }
    val deferred2 = async(IO) { getSecondValue() }
    useValues(deferred1.await(), deferred2.await())
}
job.join() // suspends current coroutine until job is done

async is similar to launch but returns a Deferred (which is the Kotlin equivalent of a Future), so we can get its result with await(). Called with no parameter, it runs in the CommonPool context.

And once again, the main thread is free while we are waiting for our 2 values.

As you can see, the launch function returns a Job that can be used to wait for the operation to be over, with the join() function. It works like in any other language, except that it suspends the coroutine instead of blocking the thread.
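As an illustration, here is a minimal, self-contained sketch of the suspending join, using the current kotlinx.coroutines API (runBlocking stands in for the UI context here; names are illustrative):

```kotlin
import kotlinx.coroutines.*

fun main() = runBlocking {
    val order = mutableListOf<String>()
    val job = launch {
        delay(100) // simulated background work
        order.add("work done")
    }
    order.add("launched")
    job.join() // suspends this coroutine until job completes; the thread stays free
    order.add("after join")
    println(order) // [launched, work done, after join]
}
```

Note that join() suspends only the calling coroutine: if anything else was scheduled on the same thread, it could still run during the wait.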


Dispatching is a key notion with coroutines: it's the action of 'jumping' from one thread to another.

Let's look at runOnUiThread, our current Java equivalent of UI dispatching:

public final void runOnUiThread(Runnable action) {
    if (Thread.currentThread() != mUiThread) {
        mHandler.post(action); // Dispatch
    } else {
        action.run(); // Immediate execution
    }
}

The Android implementation of the UI context is a dispatcher based on a Handler, so this really is the matching implementation:

launch(UI) { ... }
launch(UI, CoroutineStart.UNDISPATCHED) { ... }

launch(UI) posts a Runnable in a Handler, so its code execution is not immediate.
launch(UI, CoroutineStart.UNDISPATCHED) will immediately execute its lambda expression in the current thread.
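The ordering difference can be observed outside of Android too. Here is a minimal sketch using plain kotlinx.coroutines (current API), with runBlocking playing the role of the event loop:

```kotlin
import kotlinx.coroutines.*

fun main() = runBlocking {
    val order = mutableListOf<String>()

    // Dispatched (default): the body is posted, like a Runnable in a Handler
    launch { order.add("dispatched body") }
    order.add("after first launch")

    // UNDISPATCHED: the body runs immediately in the current thread,
    // up to its first suspension point
    launch(start = CoroutineStart.UNDISPATCHED) { order.add("undispatched body") }
    order.add("after second launch")

    yield() // let the dispatched coroutine run
    println(order)
    // [after first launch, undispatched body, after second launch, dispatched body]
}
```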

The UI context guarantees that the coroutine is dispatched on the main thread when it resumes, and it uses a Handler, as the native Android implementation does, to post into the application event loop.

See its actual implementation:

val UI = HandlerContext(Handler(Looper.getMainLooper()), "UI")

To get a better understanding of Android dispatching, you can read this blog post on Understanding Android Core: Looper, Handler, and HandlerThread.

Coroutine context

A coroutine context (aka coroutine dispatcher) defines on which thread its code will execute, what to do in case of a thrown exception, and refers to a parent context, to propagate cancellation.

val job = Job()
val exceptionHandler = CoroutineExceptionHandler {
    coroutineContext, throwable -> whatever(throwable)
}

launch(CommonPool+exceptionHandler, parent = job) { ... }

job.cancel() will cancel all coroutines that have job as a parent, and exceptionHandler will receive all exceptions thrown in these coroutines.


  • Coroutines limit Java interoperability
  • Confine mutability to avoid locks
  • Coroutines are made for waiting
    • Avoid I/O in CommonPool (and UI…)
    • A SharedPool dispatcher is coming soon to improve this
  • Threads are expensive, so are single-thread contexts
  • CommonPool is based on a ForkJoinPool on Android 5+
  • Coroutines can be used via Channels

CommonPool is a thread pool designed for intensive use. If you perform I/O tasks in it, you could get all its threads blocked at the same time, and any coroutine relying on it will be left waiting.
JetBrains is addressing this issue and will probably release a shared pool guaranteeing that at least one thread is always free from I/O operations.
For now, it's important to keep it free from long tasks, and execute them in dedicated threads/contexts instead, like:

val IO = ThreadPoolExecutor(0, Integer.MAX_VALUE, 60L,
        TimeUnit.SECONDS, SynchronousQueue<Runnable>()).asCoroutineDispatcher()

Callbacks and locks elimination with channels

Channel definition from the JetBrains documentation:

A Channel is conceptually very similar to BlockingQueue. One key difference is that instead of a blocking put operation it has a suspending send, and instead of a blocking take operation it has a suspending receive.
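To make the quote concrete, here is a minimal sketch of a suspending send/receive pair, written against the current kotlinx.coroutines API:

```kotlin
import kotlinx.coroutines.*
import kotlinx.coroutines.channels.Channel

fun main() = runBlocking {
    // Rendezvous channel: send suspends until a receiver is ready
    val channel = Channel<Int>()
    launch {
        for (i in 1..3) channel.send(i) // suspends, never blocks the thread
        channel.close()
    }
    for (value in channel) println(value) // suspending receive; loop ends on close()
}
```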


Let’s start with a simple tool to use Channels, the Actor.

We already saw it in this blog with the DiffUtil kotlin implementation.

An Actor is, yet again, very similar to a Handler: we define a coroutine context (so, the thread where to execute actions) and it will execute them in sequential order.

The difference is that it uses coroutines of course :), we can specify a capacity, and the executed code can suspend.

An actor basically forwards any order to a coroutine Channel. It guarantees the execution order and confines operations in its context. It greatly helps to remove synchronize calls and keep all threads free!

protected val updateActor by lazy {
    actor<Update>(UI, capacity = Channel.UNLIMITED) {
        for (update in channel) when (update) {
            Refresh -> updateList()
            is Filter -> filter.filter(update.query)
            is MediaUpdate -> updateItems(update.mediaList as List<T>)
            is MediaAddition -> addMedia(update.media as T)
            is MediaListAddition -> addMedia(update.mediaList as List<T>)
            is MediaRemoval -> removeMedia(update.media as T)
        }
    }
}

// usage
suspend fun filter(query: String?) = updateActor.offer(Filter(query))

In this example, we take advantage of the Kotlin sealed classes feature to select which action to execute.

sealed class Update
object Refresh : Update()
class Filter(val query: String?) : Update()
class MediaAddition(val media: Media) : Update()
class MediaUpdate(val mediaList: List<Media>) : Update()
class MediaListAddition(val mediaList: List<Media>) : Update()
class MediaRemoval(val media: Media) : Update()

And all these actions will be queued; they will never run in parallel. That's a good way to achieve mutability confinement.

Android lifecycle + Coroutines

(Sample shamefully copied from JetBrains' Guide to UI programming with coroutines)

Actors can be useful for Android UI management too: they can ease task cancellation and prevent overloading of the UI thread.

Let’s first declare a JobHolder interface, which will be applied to our Activity. This job will be used as a parent for any user triggered task, and will allow their cancellation.

interface JobHolder {
    val job: Job
}

Let’s implement it and call job.cancel() when activity is destroyed.

class MyActivity : AppCompatActivity(), JobHolder {
    override val job: Job = Job() // the instance of a Job for this activity

    override fun onDestroy() {
        super.onDestroy()
        job.cancel() // cancel the job when activity is destroyed
    }
}

Even better: with an extension function, we can make this Job accessible from any View of a JobHolder:

val View.contextJob: Job
    get() = (context as? JobHolder)?.job ?: NonCancellable

We can now combine all of this. The setOnClick function creates a conflated actor to manage its onClick actions. In case of multiple clicks, intermediate actions will be ignored, preventing any ANR, and these actions will be executed in a context with contextJob as a parent. So they will be cancelled when the Activity is destroyed 😎

fun View.setOnClick(action: suspend () -> Unit) {
    // launch one actor as a parent of the context job
    val eventActor = actor<Unit>(context = UI,
                start = CoroutineStart.UNDISPATCHED,
                capacity = Channel.CONFLATED,
                parent = contextJob) {
        for (event in channel) action()
    }
    // install a listener to activate this actor
    setOnClickListener { eventActor.offer(Unit) }
}

In this example, we set the Channel as CONFLATED to ignore events when we have too many of them. You can change it to Channel.UNLIMITED if you prefer to queue events without missing any of them, while still protecting your app from ANRs.
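Here is a small sketch of what conflation means in practice, using the current kotlinx.coroutines API (where trySend has replaced offer): a conflated channel keeps only the latest element of a burst.

```kotlin
import kotlinx.coroutines.*
import kotlinx.coroutines.channels.Channel

fun main() = runBlocking {
    val clicks = Channel<Int>(Channel.CONFLATED) // keeps only the most recent element
    repeat(5) { clicks.trySend(it) } // simulate a burst of 5 'click' events, no consumer yet
    println(clicks.receive()) // only the last event (4) survived the burst
}
```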

We also can combine coroutines and Lifecycle frameworks to automate UI tasks cancellation:

val LifecycleOwner.untilDestroy: Job get() {
    val job = Job()

    lifecycle.addObserver(object : LifecycleObserver {
        @OnLifecycleEvent(Lifecycle.Event.ON_DESTROY)
        fun onDestroy() { job.cancel() }
    })

    return job
}

// usage
launch(UI, parent = untilDestroy) { /* amazing things happen here! */ }

Callbacks mitigation (Part 1)

Here is an example of a callback-based API transformed thanks to a Channel.

API works like this:

  1. requestBrowsing(url, listener) triggers the parsing of folder at url address.
  2. The listener receives onMediaAdded(media: Media) for each discovered media in this folder.
  3. listener.onBrowseEnd() is called once folder parsing is done.

Here is the old refresh function in VLC browser provider:

private val refreshList = mutableListOf<Media>()

fun refresh() = requestBrowsing(url, refreshListener)

private val refreshListener = object : EventListener {
    override fun onMediaAdded(media: Media) {
        refreshList.add(media)
    }

    override fun onBrowseEnd() {
        val list = refreshList.toMutableList()
        launch(UI) {
            dataset.value = list
        }
    }
}

How to improve this?

We create a channel, which will be initiated in refresh. The browser callbacks will now only forward media to this channel, then close it.

The refresh function is now easier to understand. It sets up the channel, calls the VLC browser, then fills a list with the media and processes it.

Instead of the select or consumeEach functions, we can use a for loop to wait for media; the loop breaks once browserChannel is closed:

private lateinit var browserChannel : Channel<Media>

override fun onMediaAdded(media: Media) {
    browserChannel.offer(media)
}

override fun onBrowseEnd() {
    browserChannel.close()
}

suspend fun refresh() {
    browserChannel = Channel(Channel.UNLIMITED)
    val refreshList = mutableListOf<Media>()
    requestBrowsing(url)
    //Suspends at every iteration to wait for media
    for (media in browserChannel) refreshList.add(media)
    //Channel has been closed
    dataset.value = refreshList
}

Callbacks mitigation (Part 2): Retrofit

Second approach, we don’t use kotlinx-coroutines at all but the coroutine core framework.
Let’s see how coroutines really work!

The retrofitSuspendCall function wraps a Retrofit Call request to make it a suspend function.
With suspendCoroutine, we call the Call.enqueue method and suspend the coroutine. The provided callback will call continuation.resume(response) to resume the coroutine with the server response once it's received.

Then, we just have to bundle our Retrofit functions in retrofitSuspendCall to get suspending functions returning the request's result.

suspend inline fun <reified T> retrofitSuspendCall(crossinline request: () -> Call<T>
) : Response<T> = suspendCoroutine { continuation ->
    request.invoke().enqueue(object : Callback<T> {
        override fun onResponse(call: Call<T>, response: Response<T>) {
            continuation.resume(response)
        }
        override fun onFailure(call: Call<T>, t: Throwable) {
            continuation.resumeWithException(t)
        }
    })
}

suspend fun browse(path: String?) = retrofitSuspendCall {
    apiService.browse(path) // the plain Retrofit Call-returning method (name assumed)
}

// usage (within UI coroutine context)
livedata.value = Repo.browse(path)

This way, the network blocking call is done in Retrofit dedicated thread, coroutine is here to wait for the response, and in-app usage couldn’t be simpler!

This implementation is inspired by gildor/kotlin-coroutines-retrofit library, which makes it ready to use.
JakeWharton/retrofit2-kotlin-coroutines-adapter is also available with another implementation, for the same result.

To be continued

The Channel framework can be used in many other ways; you can look at BroadcastChannel for more powerful implementations according to your needs.
We can also create channels with the produce function.
Channels can also be useful for communication between UI components: an adapter can pass click events to its Fragment/Activity via a Channel or an Actor, for example.
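For reference, here is a minimal sketch of produce with the current kotlinx.coroutines API (where it is still marked experimental): it builds a coroutine that owns a channel, and closes it when the block completes.

```kotlin
import kotlinx.coroutines.*
import kotlinx.coroutines.channels.produce

@OptIn(ExperimentalCoroutinesApi::class)
fun main() = runBlocking {
    val squares = produce {
        for (x in 1..4) send(x * x) // channel is closed automatically afterwards
    }
    for (value in squares) println(value) // 1, 4, 9, 16
}
```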


July 13, 2018 12:00 AM

March 13, 2018

MobileVLCKit and VLCKit, part 3

Felix Paul Kühne

This is part of an article series covering VLC’s Objective-C framework, which we provide to allow inclusion of all its features in third party applications as well as VLC for iOS and Apple TV.

Previously published:

Today, we will discuss thumbnailing of video content. We need to differentiate two key aspects: saving still images of a currently playing video (snapshots) and previewing media stored somewhere without playing it (thumbnails). While either way VLCKit will open the resource, decode the bitstream and provide you with an image, performance and usability will differ.


Let's start with thumbnailing a non-playing media source, which can be stored locally or remotely.

@implementation DummyObject <VLCMediaThumbnailerDelegate>

- (void)workerMethod
{
    // 1
    NSURL *url = [NSURL URLWithString:@""];
    VLCMedia *media = [VLCMedia mediaWithURL:url];

    // 2
    VLCMediaThumbnailer *thumbnailer = [VLCMediaThumbnailer thumbnailerWithMedia:media delegate:self];

    // 3
    CGSize thumbSize = CGSizeMake(800., 600.);
    thumbnailer.thumbnailWidth = thumbSize.width;
    thumbnailer.thumbnailHeight = thumbSize.height;

    // 4
    [thumbnailer fetchThumbnail];
}

- (void)mediaThumbnailer:(VLCMediaThumbnailer *)mediaThumbnailer didFinishThumbnail:(CGImageRef)thumbnail
{
    // 5
    if (thumbnail) {
        UIImage *thumbnailImage = [UIImage imageWithCGImage:thumbnail scale:[UIScreen mainScreen].scale orientation:UIImageOrientationUp];
        if (thumbnailImage) {
            // TODO: do something with the thumbnail!
        }
    }
}

- (void)mediaThumbnailerDidTimeOut:(VLCMediaThumbnailer *)mediaThumbnailer
{
     // TODO: Show a reaction
}

@end

  1. We need to create a NSURL instance along with its VLCMedia representation. Note that the URL may point to both a local or a remote resource.
  2. We create the thumbnailer instance for our media and point to ourselves as a delegate to receive the thumbnail.
  3. We define the size of the resulting thumbnail.  If width and height are set to zero, the video’s original size will be used. If you set either width or height to zero, the aspect-ratio is preserved.
  4. Finally, we call the thumbnailer’s worker function.
  5. Asynchronously, after about two to twenty seconds, we will receive a response from the thumbnailer through the delegate. It is important to check the thumbnail for NULL before trying to bridge it to a UIImage or NSImage, as well as to check the result afterwards, as the translation can fail. That's all.

You might be wondering how the thumbnailer decides which frame to return. This is based on a more complex algorithm currently depending on the media's duration and the availability of key frames. Future versions may also analyze the image content.
You can override this algorithm with the thumbnailer's snapshotPosition property (with a 0.0 to 1.0 range).


The VLCMediaPlayer class includes a very basic API, which allows the creation of an infinite number of snapshots during playback, which will be asynchronously stored as local files. The size parameters follow the same pattern as for the thumbnailer.

- (void)workerMethod
{
    // ...
    // path is an NSString *, width and height are ints
    [_mediaplayer saveVideoSnapshotAt:path withWidth:width andHeight:height];
    // ...
}

As soon as the snapshot is stored, a VLCMediaPlayerSnapshotTaken notification is emitted and mediaPlayerSnapshot: is called on the media player's delegate. Note that the delegate call is available on iOS and tvOS only.
As a convenience starting in VLCKit 3.0 on iOS and tvOS, the media player class exposes the lastSnapshot and snapshots properties, which provide a UIImage instance of the last shot as well as a list of files of the taken shots.

That’s all for today. Enjoy using VLCKit!

March 13, 2018 12:27 PM

February 19, 2018

VLCKit 3.0

Felix Paul Kühne

10 days ago, we published VLC media player 3.0 for all platforms. It’s the first major release in three years and brings a huge number of features, improvements and fixes. Get an overview here and the full changelog there.

For VLCKit, we improved performance and memory management, added new APIs, and you get all improvements from the underlying libvlc, including full support for decoding H264 and H265 in hardware using VideoToolbox. Instead of using all cores of your iPhone's CPU at 100%, decoding a 4K video uses less than 20%.
Furthermore, you can look at all aspects of a 360° video with touch-gesture-based controls, and discover and browse shares on your network with UPnP, NFS, FTP, SFTP, SMB and more.

As you may remember, we published VLC for Apple TV in January 2016, but so far, we had never made VLCKit available on tvOS. In addition to MobileVLCKit for iOS, we now introduce TVVLCKit for tvOS!

For macOS, iOS and tvOS, VLCKit 3.0 is available through CocoaPods as a precompiled binary under the LGPLv2.1 license. You can find the source code on our website – contributions welcome!

We are looking forward to all your feedback and the apps deploying VLCKit to deliver multimedia to their users.
Do you want to learn more about integrating VLCKit? Have a look at the tutorials I wrote not too long ago (Part 1, Part 2).

So what did we change in VLCKit, API-wise?

New APIs:
- VLCAudio
 - setMuted:

- VLCDialogProvider
 - new class to handle user interaction with VLC events

- VLCLibrary
 - added properties: debugLogging, debugLoggingLevel

- VLCMediaDiscoverer
 - added selector: availableMediaDiscovererForCategoryType:
 - added enum: VLCMediaDiscovererCategoryType

- VLCMediaListPlayer
 - added selectors:

- VLCMediaPlayer
 - added properties:
 - added selectors:
 - added notifications: VLCMediaPlayerTitleChanged, VLCMediaPlayerChapterChanged
 - added enum: VLCMediaPlaybackSlaveType
 - play's return type was changed from BOOL to void
 - hue is now a float instead of an integer
 - Return value of the following methods changed from INT_MAX to -1

- VLCMedia
 - added keys: VLCMetaInformationTrackTotal, VLCMetaInformationDirector,
 VLCMetaInformationSeason, VLCMetaInformationEpisode,
 VLCMetaInformationShowName, VLCMetaInformationActors,
 VLCMetaInformationAlbumArtist, VLCMetaInformationDiscNumber,
 - added selectors:
 - added enums: VLCMediaType, VLCMediaParsingOptions, VLCMediaParsedStatus, VLCMediaOrientation, VLCMediaProjection
 - changed behavior: media will no longer be parsed automatically if meta data is requested prior to concluded parsing

- VLCMediaList
 - changed behavior: lists of media objects added through arrays or on init are no longer added in reverse order

- VLCTime
 - added selectors:

- VLCAudio
 - added property: passthrough

Modified APIs:
- VLCMediaList
 - To match the KVC bindings, all NSInteger arguments were moved to NSUInteger as appropriate
 - mediaList:mediaAdded:atIndex:
 - mediaList:mediaRemovedAtIndex:
 - addMedia:
 - insertMedia:atIndex:
 - removeMediaAtIndex:
 - mediaAtIndex:

Deprecated APIs:
- VLCAudio
 - setMute:
- VLCMedia
 - parse, isParsed, synchronousParse
- VLCMediaDiscoverer
 - availableMediaDiscoverer, localizedName
- VLCMediaPlayer
 - titles, chaptersForTitleIndex:, countOfTitles, framesPerSecond, openVideoSubTitlesFromFile:
- VLCMediaListPlayer
 - playItemAtIndex
- VLCStreamSession
- VLCStreamOutput
- VLCMediaLibrary

Removed APIs:
- VLCExtension
- VLCExtensionsManager
- VLCMedia:
 - fps
 - media:metaValueChangedFrom:forKey:
- VLCMediaPlayer
 - audioTracks
 - videoTracks
 - videoSubTitles
- VLCServicesDiscoverer
- VLCPlaylistDataSource

February 19, 2018 03:22 PM

February 09, 2018

Announcing VLC 3.0 release!

Geoffrey Métais

Version 2.5 has been a nice upgrade for VLC on Android. Now it’s been stabilized, and we are finally shipping the long awaited version 3.0!

VLC 3.0 is the first ever synchronized release between desktop application and mobile ports. Today, VLC is released everywhere, with the same version number and at the same time. It will be simpler for everyone, including VideoLAN developers 😊

On Android, this release mainly brings the Chromecast feature, but it also fills some gaps from prior versions. VLC on Android is becoming more complete, and it will continue!


Stop vertical videos now! Turn your phone horizontally when you record your children, because you’re going to show them to the family on the big screen now 📺

Chromecast support is finally here. As soon as a Chromecast is detected by VLC, you can send it a video or audio media and enjoy watching it!
If the media codecs are supported by your Chromecast device, VLC only acts as a streaming server (which consumes battery). If not, VLC will transcode and stream the media, which is highly CPU and battery consuming.

Chromecast remote

Please consider Chromecast support beta for now; we will work on hardening it in the upcoming weeks thanks to your feedback.

VLC everywhere

VLC for Android is also available on different Android platforms like DeX, Chromebooks and Android Auto.


You can now drop media files onto VLC from other applications, and right-click on VLC media to get the context menu.

Chrome OS

Android Auto allows you to easily command VLC with a simplified UI or even by voice while driving. You can ask “play Daft Punk (with VLC)” and Google Assistant will recognize whether it’s an artist, an album or a song you’re asking for and tell VLC to play it.

auto player

Playlist files

Version 2.5 suffered from a painful regression: the lack of playlist file support. This was due to the migration to our new multiplatform medialibrary. It is now fixed with this update: VLC can scan your .m3u files again and show your playlists.

Features Catch-Up

Delete is back

VLC 2.5 had trouble deleting files on internal storage on Oreo; this is now resolved and the new permission access is well managed.

But the big news is the support of media deletion on external devices for Android Lollipop+ devices. All you have to do is select the sdcard/USB key in the awful Google dialog, and then VLC can delete any file on it!
For this special process, we prepared a small tutorial in the app, which looks like this: sdcard wizard

Fast seek

The 'Fast seek' VLC option is now activated by default. VLC will now seek faster when you change the current position during media playback. This can be deactivated in Settings → Video → 'Enable fast seek'.

Subs (not) auto-loading

Not everyone wants subtitles, and we had not focused on this because most VideoLAN developers are European. But now you can deactivate automatic subtitle loading in Settings → Subtitles → 'Auto load subtitles'.

Enjoy your videos without distraction now!

Why Chromecast support took so long

Chromecast support is everywhere and VLC took years to get it, right. But there are plenty of good reasons for that:

First of all, VideoLAN is a nonprofit organization, not a company. Few developers are paid to work on VLC; most of them do it in their free time. That's how you get VLC for free and without any ads!

Also, VLC is 100% open source and the Chromecast SDK isn't: we had to develop our very own Chromecast stack ourselves. This is also why there are no voice actions for VLC (except with Android Auto); we cannot use Google Play Services.

Furthermore, Chromecast is not designed to play local video files: when you watch a YouTube video, your phone is just a remote controller, nothing more. The Chromecast streams the video directly from YouTube's servers.
That's where it becomes complicated: Chromecast only supports a very small number of codecs, let's say h264. Google ensures that your video is encoded in h264 format on YouTube, so streaming is simple.
With VLC, you have media in any format. So VLC has to act as an HTTP server, like YouTube's, and provide the video in a Chromecast-compatible format. And of course in real time, which is challenging on Android because phones are less powerful than computers.

Lastly, VLC was not designed to display a video on another screen. It took time to properly redesign VLC to support this nicely. The good news is that we did not implement Chromecast-specific support, but generic renderer support: in the next months we can add UPnP support, for example, to cast to any UPnP box or TV!

February 09, 2018 12:00 AM