Add video filter support
DanielNovak committed Sep 20, 2023
1 parent 86bed9f commit aad2d66
Showing 15 changed files with 518 additions and 53 deletions.
131 changes: 127 additions & 4 deletions docusaurus/docs/Android/06-advanced/05-apply-video-filters.mdx
@@ -3,9 +3,132 @@ title: Video & Audio filters
description: How to build video or audio filters
---

## Apply Custom Video Filters
## Video Filters

// TODO - cover how to apply custom filters, where they live/exist and some common examples
Some calling apps allow filters to be applied to the current user's video, such as blurring the background, adding AR elements (glasses, moustaches, etc.) or applying image filters (sepia, bloom, etc.). StreamVideo's Android SDK supports injecting your own custom filters into the calling experience.

// sepia
// grayscale
How does this work? You inject a filter through `Call.videoFilter` and receive each frame of the user's local video as a `Bitmap`, which you can mutate to apply your effect. This gives you complete freedom over the processing pipeline.
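
In practice, the injection point is a single property on the call object. A minimal sketch is shown below; `myFilter` is a hypothetical placeholder for one of the filter implementations covered in the next section:

```kotlin
// Apply a filter to the local video stream.
call.videoFilter = myFilter

// Setting the property back to null removes the filter again.
call.videoFilter = null
```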

## Adding a Video Filter

Create a `BitmapVideoFilter` or `RawVideoFilter` instance in your project. This is what the abstract classes look like:

```kotlin
abstract class BitmapVideoFilter : VideoFilter() {
    abstract fun filter(bitmap: Bitmap)
}

abstract class RawVideoFilter : VideoFilter() {
    abstract fun filter(videoFrame: VideoFrame, surfaceTextureHelper: SurfaceTextureHelper): VideoFrame
}
```
`BitmapVideoFilter` is the simpler option: it gives you a `Bitmap` that you can manipulate directly. It is less performant than `RawVideoFilter`, which gives you direct access to the WebRTC `VideoFrame` and avoids the overhead that `BitmapVideoFilter` adds (such as YUV <-> ARGB conversions).
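
As an illustration of the raw variant, a minimal pass-through `RawVideoFilter` might look like the sketch below; it simply returns each frame unchanged, and any real frame processing is up to you:

```kotlin
// A no-op RawVideoFilter sketch: it receives each WebRTC VideoFrame and returns it as-is.
val passthroughFilter = object : RawVideoFilter() {
    override fun filter(
        videoFrame: VideoFrame,
        surfaceTextureHelper: SurfaceTextureHelper,
    ): VideoFrame = videoFrame
}
```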

You then set the filter instance on `Call.videoFilter`.

We can create and set a simple black and white filter like this:
```kotlin
call.videoFilter = object : BitmapVideoFilter() {
    override fun filter(bitmap: Bitmap) {
        val c = Canvas(bitmap)
        val paint = Paint()
        val cm = ColorMatrix()
        cm.setSaturation(0f)
        val f = ColorMatrixColorFilter(cm)
        paint.colorFilter = f
        c.drawBitmap(bitmap, 0f, 0f, paint)
    }
}
```
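
Following the same pattern, a sepia look is just a different `ColorMatrix`. The sketch below first desaturates the frame and then scales the color channels towards warm tones; the 1f/0.95f/0.82f scales are an illustrative choice, not an SDK constant:

```kotlin
call.videoFilter = object : BitmapVideoFilter() {
    override fun filter(bitmap: Bitmap) {
        val canvas = Canvas(bitmap)
        val paint = Paint()
        val colorMatrix = ColorMatrix()
        // Drop to grayscale first, then tint towards brown/yellow.
        colorMatrix.setSaturation(0f)
        val sepiaTint = ColorMatrix()
        sepiaTint.setScale(1f, 0.95f, 0.82f, 1f)
        colorMatrix.postConcat(sepiaTint)
        paint.colorFilter = ColorMatrixColorFilter(colorMatrix)
        canvas.drawBitmap(bitmap, 0f, 0f, paint)
    }
}
```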

:::note
You need to manipulate the original bitmap instance to apply the filters. You can, of course, create a new bitmap in the process, but you then need to draw it back onto the `bitmap` instance you receive in the `filter` callback.
:::
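
For example, if your processing step produces a new `Bitmap`, writing it back could look like this sketch (`process()` is a hypothetical helper that returns a new bitmap of the same size):

```kotlin
call.videoFilter = object : BitmapVideoFilter() {
    override fun filter(bitmap: Bitmap) {
        // `process()` stands in for your own code that returns a new Bitmap.
        val result = process(bitmap)
        // Draw the processed copy back onto the original instance the SDK renders.
        Canvas(bitmap).drawBitmap(result, 0f, 0f, null)
        result.recycle()
    }
}
```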

## Adding AI Filters

In some cases, you might also want to apply AI filters: adding elements to the user's face (glasses, moustaches, etc.) or running an ML model over the frame. This section covers that use case. Specifically, you will use [Selfie Segmentation](https://developers.google.com/ml-kit/vision/selfie-segmentation/android) from Google's ML Kit to change the background behind you.

First, include the necessary dependency (check for the latest version [here](https://developers.google.com/ml-kit/vision/selfie-segmentation/android#before_you_begin)):

```gradle
dependencies {
    implementation 'com.google.mlkit:segmentation-selfie:16.0.0-beta4'
}
```

Create a class that will hold your custom filter logic. The segmenter is initialised according to the [official docs](https://developers.google.com/ml-kit/vision/selfie-segmentation/android#enable_raw_size_mask).

```kotlin
class SelfieSegmentation {

    private val options =
        SelfieSegmenterOptions.Builder()
            .setDetectorMode(SelfieSegmenterOptions.STREAM_MODE)
            .enableRawSizeMask()
            .build()
    private val segmenter = Segmentation.getClient(options)

    fun applyFilter(bitmap: Bitmap) {
        // Send the bitmap into ML Kit for processing
        val mlImage = InputImage.fromBitmap(bitmap, 0)
        val task = segmenter.process(mlImage)
        // Wait for result synchronously on same thread
        val mask = Tasks.await(task)

        val isRawSizeMaskEnabled = mask.width != bitmap.width || mask.height != bitmap.height
        val scaleX = bitmap.width * 1f / mask.width
        val scaleY = bitmap.height * 1f / mask.height

        // Create a bitmap mask to cover the background
        val maskBitmap = Bitmap.createBitmap(
            maskColorsFromByteBuffer(mask), mask.width, mask.height, Bitmap.Config.ARGB_8888
        )
        // Create a canvas from the frame bitmap
        val canvas = Canvas(bitmap)
        val matrix = Matrix()
        if (isRawSizeMaskEnabled) {
            matrix.preScale(scaleX, scaleY)
        }
        // And now draw the bitmap mask onto the original bitmap
        canvas.drawBitmap(maskBitmap, matrix, null)

        maskBitmap.recycle()
    }

    private fun maskColorsFromByteBuffer(mask: SegmentationMask): IntArray {
        val colors = IntArray(mask.width * mask.height)
        for (i in 0 until mask.width * mask.height) {
            val backgroundLikelihood = 1 - mask.buffer.float
            if (backgroundLikelihood > 0.9) {
                colors[i] = Color.argb(128, 0, 0, 255)
            } else if (backgroundLikelihood > 0.2) {
                val alpha = (182.9 * backgroundLikelihood - 36.6 + 0.5).toInt()
                colors[i] = Color.argb(alpha, 0, 0, 255)
            }
        }
        return colors
    }
}
```

And now set the custom filter on the call:

```kotlin
call.videoFilter = object : BitmapVideoFilter() {

    val selfieFilter = SelfieSegmentation()

    override fun filter(bitmap: Bitmap) {
        selfieFilter.applyFilter(bitmap)
    }
}
```

The result:

![Stream Filter](../assets/screenshot_video_filter.png)

## Audio Filters

TODO
@@ -17,6 +17,7 @@
package io.getstream.video.android.ui.call

import android.app.Activity
import android.graphics.Bitmap
import android.media.projection.MediaProjectionManager
import android.widget.Toast
import androidx.activity.compose.rememberLauncherForActivityResult
@@ -45,7 +46,9 @@ import androidx.compose.ui.unit.dp
import androidx.compose.ui.window.Popup
import io.getstream.video.android.compose.theme.VideoTheme
import io.getstream.video.android.core.Call
import io.getstream.video.android.core.call.video.BitmapVideoFilter
import io.getstream.video.android.ui.common.R
import io.getstream.video.android.util.SampleVideoFilter
import kotlinx.coroutines.launch

@Composable
@@ -144,6 +147,34 @@ internal fun SettingsMenu(

Spacer(modifier = Modifier.height(12.dp))

Row(
    modifier = Modifier.clickable {
        if (call.videoFilter == null) {
            call.videoFilter = object : BitmapVideoFilter() {
                override fun filter(bitmap: Bitmap) {
                    SampleVideoFilter.toGrayscale(bitmap)
                }
            }
        } else {
            call.videoFilter = null
        }
    },
) {
    Icon(
        painter = painterResource(id = R.drawable.stream_video_ic_fullscreen_exit),
        tint = VideoTheme.colors.textHighEmphasis,
        contentDescription = null,
    )

    Text(
        modifier = Modifier.padding(start = 20.dp),
        text = "Toggle video filter",
        color = VideoTheme.colors.textHighEmphasis,
    )
}

Spacer(modifier = Modifier.height(12.dp))

if (showDebugOptions) {
Row(
modifier = Modifier.clickable {
@@ -14,6 +14,23 @@
* limitations under the License.
*/

package io.getstream.video.android.core.filter
package io.getstream.video.android.util

public interface AudioFilter
import android.graphics.Bitmap
import android.graphics.Canvas
import android.graphics.ColorMatrix
import android.graphics.ColorMatrixColorFilter
import android.graphics.Paint

object SampleVideoFilter {

    fun toGrayscale(bmpOriginal: Bitmap) {
        val c = Canvas(bmpOriginal)
        val paint = Paint()
        val cm = ColorMatrix()
        cm.setSaturation(0f)
        val f = ColorMatrixColorFilter(cm)
        paint.colorFilter = f
        c.drawBitmap(bmpOriginal, 0f, 0f, paint)
    }
}
3 changes: 3 additions & 0 deletions gradle/libs.versions.toml
@@ -32,6 +32,7 @@ landscapist = "2.2.8"
accompanist = "0.30.1"
telephoto = "0.3.0"
audioswitch = "1.1.8"
libyuv = "0.28.0"

wire = "4.7.0"
okhttp = "4.11.0"
@@ -120,6 +121,8 @@ telephoto = { group = "me.saket.telephoto", name = "zoomable", version.ref = "te

audioswitch = { group = "com.twilio", name = "audioswitch", version.ref = "audioswitch"}

libyuv = { group = "io.github.crow-misia.libyuv", name = "libyuv-android", version.ref = "libyuv"}

wire-runtime = { group = "com.squareup.wire", name = "wire-runtime", version.ref = "wire" }
retrofit = { group = "com.squareup.retrofit2", name = "retrofit", version.ref = "retrofit" }
retrofit-moshi = { group = "com.squareup.retrofit2", name = "converter-moshi", version.ref = "retrofit" }
32 changes: 19 additions & 13 deletions stream-video-android-core/api/stream-video-android-core.api
@@ -21,6 +21,7 @@ public final class io/getstream/video/android/core/Call {
public final fun getState ()Lio/getstream/video/android/core/CallState;
public final fun getType ()Ljava/lang/String;
public final fun getUser ()Lio/getstream/video/android/model/User;
public final fun getVideoFilter ()Lio/getstream/video/android/core/call/video/VideoFilter;
public final fun goLive (ZZZLkotlin/coroutines/Continuation;)Ljava/lang/Object;
public static synthetic fun goLive$default (Lio/getstream/video/android/core/Call;ZZZLkotlin/coroutines/Continuation;ILjava/lang/Object;)Ljava/lang/Object;
public final fun grantPermissions (Ljava/lang/String;Ljava/util/List;Lkotlin/coroutines/Continuation;)Ljava/lang/Object;
@@ -50,6 +51,7 @@ public final class io/getstream/video/android/core/Call {
public final fun sendReaction (Ljava/lang/String;Ljava/lang/String;Ljava/util/Map;Lkotlin/coroutines/Continuation;)Ljava/lang/Object;
public static synthetic fun sendReaction$default (Lio/getstream/video/android/core/Call;Ljava/lang/String;Ljava/lang/String;Ljava/util/Map;Lkotlin/coroutines/Continuation;ILjava/lang/Object;)Ljava/lang/Object;
public final fun sendStats (Ljava/util/Map;Lkotlin/coroutines/Continuation;)Ljava/lang/Object;
public final fun setVideoFilter (Lio/getstream/video/android/core/call/video/VideoFilter;)V
public final fun setVisibility (Ljava/lang/String;Lstream/video/sfu/models/TrackType;Z)V
public final fun startHLS (Lkotlin/coroutines/Continuation;)Ljava/lang/Object;
public final fun startRecording (Lkotlin/coroutines/Continuation;)Ljava/lang/Object;
@@ -741,12 +743,10 @@ public final class io/getstream/video/android/core/StreamVideoBuilder {
public fun <init> (Landroid/content/Context;Ljava/lang/String;Lio/getstream/video/android/core/GEO;Lio/getstream/video/android/model/User;Ljava/lang/String;Lkotlin/jvm/functions/Function2;Lio/getstream/video/android/core/logging/LoggingLevel;)V
public fun <init> (Landroid/content/Context;Ljava/lang/String;Lio/getstream/video/android/core/GEO;Lio/getstream/video/android/model/User;Ljava/lang/String;Lkotlin/jvm/functions/Function2;Lio/getstream/video/android/core/logging/LoggingLevel;Lio/getstream/video/android/core/notifications/NotificationConfig;)V
public fun <init> (Landroid/content/Context;Ljava/lang/String;Lio/getstream/video/android/core/GEO;Lio/getstream/video/android/model/User;Ljava/lang/String;Lkotlin/jvm/functions/Function2;Lio/getstream/video/android/core/logging/LoggingLevel;Lio/getstream/video/android/core/notifications/NotificationConfig;Lkotlin/jvm/functions/Function1;)V
public fun <init> (Landroid/content/Context;Ljava/lang/String;Lio/getstream/video/android/core/GEO;Lio/getstream/video/android/model/User;Ljava/lang/String;Lkotlin/jvm/functions/Function2;Lio/getstream/video/android/core/logging/LoggingLevel;Lio/getstream/video/android/core/notifications/NotificationConfig;Lkotlin/jvm/functions/Function1;Ljava/util/List;)V
public fun <init> (Landroid/content/Context;Ljava/lang/String;Lio/getstream/video/android/core/GEO;Lio/getstream/video/android/model/User;Ljava/lang/String;Lkotlin/jvm/functions/Function2;Lio/getstream/video/android/core/logging/LoggingLevel;Lio/getstream/video/android/core/notifications/NotificationConfig;Lkotlin/jvm/functions/Function1;Ljava/util/List;Ljava/util/List;)V
public fun <init> (Landroid/content/Context;Ljava/lang/String;Lio/getstream/video/android/core/GEO;Lio/getstream/video/android/model/User;Ljava/lang/String;Lkotlin/jvm/functions/Function2;Lio/getstream/video/android/core/logging/LoggingLevel;Lio/getstream/video/android/core/notifications/NotificationConfig;Lkotlin/jvm/functions/Function1;Ljava/util/List;Ljava/util/List;J)V
public fun <init> (Landroid/content/Context;Ljava/lang/String;Lio/getstream/video/android/core/GEO;Lio/getstream/video/android/model/User;Ljava/lang/String;Lkotlin/jvm/functions/Function2;Lio/getstream/video/android/core/logging/LoggingLevel;Lio/getstream/video/android/core/notifications/NotificationConfig;Lkotlin/jvm/functions/Function1;Ljava/util/List;Ljava/util/List;JZ)V
public fun <init> (Landroid/content/Context;Ljava/lang/String;Lio/getstream/video/android/core/GEO;Lio/getstream/video/android/model/User;Ljava/lang/String;Lkotlin/jvm/functions/Function2;Lio/getstream/video/android/core/logging/LoggingLevel;Lio/getstream/video/android/core/notifications/NotificationConfig;Lkotlin/jvm/functions/Function1;Ljava/util/List;Ljava/util/List;JZLjava/lang/String;)V
public synthetic fun <init> (Landroid/content/Context;Ljava/lang/String;Lio/getstream/video/android/core/GEO;Lio/getstream/video/android/model/User;Ljava/lang/String;Lkotlin/jvm/functions/Function2;Lio/getstream/video/android/core/logging/LoggingLevel;Lio/getstream/video/android/core/notifications/NotificationConfig;Lkotlin/jvm/functions/Function1;Ljava/util/List;Ljava/util/List;JZLjava/lang/String;ILkotlin/jvm/internal/DefaultConstructorMarker;)V
public fun <init> (Landroid/content/Context;Ljava/lang/String;Lio/getstream/video/android/core/GEO;Lio/getstream/video/android/model/User;Ljava/lang/String;Lkotlin/jvm/functions/Function2;Lio/getstream/video/android/core/logging/LoggingLevel;Lio/getstream/video/android/core/notifications/NotificationConfig;Lkotlin/jvm/functions/Function1;J)V
public fun <init> (Landroid/content/Context;Ljava/lang/String;Lio/getstream/video/android/core/GEO;Lio/getstream/video/android/model/User;Ljava/lang/String;Lkotlin/jvm/functions/Function2;Lio/getstream/video/android/core/logging/LoggingLevel;Lio/getstream/video/android/core/notifications/NotificationConfig;Lkotlin/jvm/functions/Function1;JZ)V
public fun <init> (Landroid/content/Context;Ljava/lang/String;Lio/getstream/video/android/core/GEO;Lio/getstream/video/android/model/User;Ljava/lang/String;Lkotlin/jvm/functions/Function2;Lio/getstream/video/android/core/logging/LoggingLevel;Lio/getstream/video/android/core/notifications/NotificationConfig;Lkotlin/jvm/functions/Function1;JZLjava/lang/String;)V
public synthetic fun <init> (Landroid/content/Context;Ljava/lang/String;Lio/getstream/video/android/core/GEO;Lio/getstream/video/android/model/User;Ljava/lang/String;Lkotlin/jvm/functions/Function2;Lio/getstream/video/android/core/logging/LoggingLevel;Lio/getstream/video/android/core/notifications/NotificationConfig;Lkotlin/jvm/functions/Function1;JZLjava/lang/String;ILkotlin/jvm/internal/DefaultConstructorMarker;)V
public final fun build ()Lio/getstream/video/android/core/StreamVideo;
public final fun getScope ()Lkotlinx/coroutines/CoroutineScope;
}
@@ -958,7 +958,6 @@ public final class io/getstream/video/android/core/call/connection/StreamPeerCon
public final fun makeAudioTrack (Lorg/webrtc/AudioSource;Ljava/lang/String;)Lorg/webrtc/AudioTrack;
public final fun makePeerConnection (Lkotlinx/coroutines/CoroutineScope;Lorg/webrtc/PeerConnection$RTCConfiguration;Lio/getstream/video/android/core/model/StreamPeerType;Lorg/webrtc/MediaConstraints;Lkotlin/jvm/functions/Function1;Lkotlin/jvm/functions/Function2;Lkotlin/jvm/functions/Function2;I)Lio/getstream/video/android/core/call/connection/StreamPeerConnection;
public static synthetic fun makePeerConnection$default (Lio/getstream/video/android/core/call/connection/StreamPeerConnectionFactory;Lkotlinx/coroutines/CoroutineScope;Lorg/webrtc/PeerConnection$RTCConfiguration;Lio/getstream/video/android/core/model/StreamPeerType;Lorg/webrtc/MediaConstraints;Lkotlin/jvm/functions/Function1;Lkotlin/jvm/functions/Function2;Lkotlin/jvm/functions/Function2;IILjava/lang/Object;)Lio/getstream/video/android/core/call/connection/StreamPeerConnection;
public final fun makeVideoSource (Z)Lorg/webrtc/VideoSource;
public final fun makeVideoTrack (Lorg/webrtc/VideoSource;Ljava/lang/String;)Lorg/webrtc/VideoTrack;
public final fun setAudioSampleCallback (Lkotlin/jvm/functions/Function1;)V
}
@@ -2740,6 +2739,19 @@ public final class io/getstream/video/android/core/call/stats/model/discriminato
public final fun fromAlias (Ljava/lang/String;)Lio/getstream/video/android/core/call/stats/model/discriminator/RtcReportType;
}

public abstract class io/getstream/video/android/core/call/video/BitmapVideoFilter : io/getstream/video/android/core/call/video/VideoFilter {
public fun <init> ()V
public abstract fun filter (Landroid/graphics/Bitmap;)V
}

public abstract class io/getstream/video/android/core/call/video/RawVideoFilter : io/getstream/video/android/core/call/video/VideoFilter {
public fun <init> ()V
public abstract fun filter (Lorg/webrtc/VideoFrame;Lorg/webrtc/SurfaceTextureHelper;)Lorg/webrtc/VideoFrame;
}

public class io/getstream/video/android/core/call/video/VideoFilter {
}

public final class io/getstream/video/android/core/dispatchers/DispatcherProvider {
public static final field INSTANCE Lio/getstream/video/android/core/dispatchers/DispatcherProvider;
public final fun getDefault ()Lkotlinx/coroutines/CoroutineDispatcher;
@@ -3081,9 +3093,6 @@ public final class io/getstream/video/android/core/filter/AndFilterObject : io/g
public fun toString ()Ljava/lang/String;
}

public abstract interface class io/getstream/video/android/core/filter/AudioFilter {
}

public final class io/getstream/video/android/core/filter/AutocompleteFilterObject : io/getstream/video/android/core/filter/FilterObject {
public final fun component1 ()Ljava/lang/String;
public final fun component2 ()Ljava/lang/String;
@@ -3295,9 +3304,6 @@ public final class io/getstream/video/android/core/filter/OrFilterObject : io/ge
public fun toString ()Ljava/lang/String;
}

public abstract interface class io/getstream/video/android/core/filter/VideoFilter {
}

public abstract interface annotation class io/getstream/video/android/core/internal/InternalStreamVideoApi : java/lang/annotation/Annotation {
}

3 changes: 3 additions & 0 deletions stream-video-android-core/build.gradle.kts
@@ -136,6 +136,9 @@ dependencies {

implementation(libs.audioswitch)

// video filter dependencies
implementation (libs.libyuv)

// androidx
implementation(libs.androidx.core.ktx)
implementation(libs.androidx.lifecycle.runtime)
@@ -26,6 +26,7 @@ import io.getstream.result.Result.Failure
import io.getstream.result.Result.Success
import io.getstream.video.android.core.call.RtcSession
import io.getstream.video.android.core.call.utils.SoundInputProcessor
import io.getstream.video.android.core.call.video.VideoFilter
import io.getstream.video.android.core.events.VideoEventListener
import io.getstream.video.android.core.internal.InternalStreamVideoApi
import io.getstream.video.android.core.model.MuteUsersData
@@ -126,6 +127,11 @@ public class Call(
/** The cid is type:id */
val cid = "$type:$id"

/**
* Set a custom [VideoFilter] that will be applied to the video stream coming from your device.
*/
var videoFilter: VideoFilter? = null

/**
* Called by the [CallHealthMonitor] when the ICE restarts failed after
* several retries. At this point we can do a full reconnect.