Commit

Merge branch 'main' into expo-ringing
vishalnarkhede authored Oct 9, 2023
2 parents dbc91a7 + b5716ef commit 9310866
Showing 61 changed files with 1,712 additions and 367 deletions.
14 changes: 14 additions & 0 deletions packages/client/CHANGELOG.md
@@ -2,6 +2,20 @@

This file was generated using [@jscutlery/semver](https://github.com/jscutlery/semver).

### [0.3.30](https://github.com/GetStream/stream-video-js/compare/@stream-io/video-client-0.3.29...@stream-io/video-client-0.3.30) (2023-10-06)


### Features

* ScreenShare Audio support ([#1118](https://github.com/GetStream/stream-video-js/issues/1118)) ([5b63e1c](https://github.com/GetStream/stream-video-js/commit/5b63e1c5f52c76e3761e6907bd3786c19f0e5c6d))

### [0.3.29](https://github.com/GetStream/stream-video-js/compare/@stream-io/video-client-0.3.28...@stream-io/video-client-0.3.29) (2023-10-05)


### Bug Fixes

* ensure stable sort ([#1130](https://github.com/GetStream/stream-video-js/issues/1130)) ([f96e1af](https://github.com/GetStream/stream-video-js/commit/f96e1af33ef9e60434e07dc0fba5161f20b8eba6))

### [0.3.28](https://github.com/GetStream/stream-video-js/compare/@stream-io/video-client-0.3.27...@stream-io/video-client-0.3.28) (2023-09-28)


@@ -95,6 +95,7 @@ The `StreamVideoParticipant` object contains the following information:
| `audioStream` | The published audio `MediaStream`. |
| `videoStream` | The published video `MediaStream`. |
| `screenShareStream` | The published screen share `MediaStream`. |
| `screenShareAudioStream` | The published screen share audio `MediaStream`. |
| `isLocalParticipant` | It's `true` if the participant is the local participant. |
| `pin` | Holds pinning information. |
| `reaction` | The last reaction this user has sent to this call. |
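
For illustration, here is a minimal sketch of reading these fields from the call state. It assumes the `call.state.participants$` observable exposed by this package; the property names match the table above:

```typescript
call.state.participants$.subscribe((participants) => {
  for (const p of participants) {
    // `screenShareAudioStream` is only set while screen share audio is published
    if (p.screenShareStream || p.screenShareAudioStream) {
      console.log(`${p.userId} is presenting`, {
        hasScreenShareAudio: !!p.screenShareAudioStream,
        isLocal: p.isLocalParticipant,
      });
    }
  }
});
```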
@@ -11,13 +11,11 @@ If you want to see the device management API in action, you can check out [the s
### Start-stop camera

```typescript
-const toggleCamera = () => {
-  call.camera.toggle();
+call.camera.toggle();

-  // or
-  call.camera.enable();
-  call.camera.disable();
-};
+// or
+call.camera.enable();
+call.camera.disable();
```

Here is how you can access the status:
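
The example that follows is collapsed in this diff; as a rough sketch, assuming the camera manager exposes the same `state.status` / `state.status$` shape as the screen-share manager documented below:

```typescript
call.camera.state.status; // enabled, disabled or undefined

// or, if you want to subscribe to changes
call.camera.state.status$.subscribe((status) => {
  // enabled, disabled or undefined
});
```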
@@ -0,0 +1,77 @@
---
id: screensharing
title: Screen Sharing
description: Managing Screen Sharing
---

If you want to see the device management API in action, you can check out [the sample app](https://github.com/GetStream/stream-video-js/tree/main/sample-apps/client/ts-quickstart).

## Screen Sharing

### Start/Stop Screen Sharing

```typescript
call.screenShare.toggle();

// or
call.screenShare.enable();
call.screenShare.disable();
```

### Screen Sharing Status

Here is how you can access the status of screen sharing:

```typescript
call.screenShare.state.status; // enabled, disabled or undefined

// or, if you want to subscribe to changes
call.screenShare.state.status$.subscribe((status) => {
// enabled, disabled or undefined
});
```

### Screen Sharing Settings

The behavior of the screen share video track can be customized, and a few parameters can be set:

```typescript
call.screenShare.setSettings({
maxFramerate: 15, // will be clamped between 1 and 15 fps
maxBitrate: 1500000, // will use at most 1.5Mbps
});

call.screenShare.enable();
```

### Render Screen Share

Please follow our [Playing Video and Audio guide](../../guides/playing-video-and-audio/).

## Screen Share Audio

### Start/Stop Screen Share Audio

```typescript
// enable it
call.screenShare.enableScreenShareAudio();

// publish video and audio (if available, and supported by the browser)
call.screenShare.enable();

// disable it
call.screenShare.disableScreenShareAudio();
```

### Play Screen Share Audio

Please follow our [Playing Video and Audio guide](../../guides/playing-video-and-audio/).

### Caveats

Screen Share Audio has limited support across browsers and platforms.
For the most up-to-date information, please take a look at [Browser Compatibility](https://developer.mozilla.org/en-US/docs/Web/API/Screen_Capture_API/Using_Screen_Capture#browser_compatibility).

In addition to that, there are a [few caveats](https://caniuse.com/?search=getDisplayMedia) that you should be aware of:

- On Windows, the entire system audio can be captured, but on macOS and Linux, only the audio of a tab can be captured. Because of this, it's worth checking at runtime whether an audio track was actually captured, as sketched below.
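
Below is a minimal sketch using only the standard `getDisplayMedia` browser API; the helper name is illustrative and not part of this SDK, and calling it will prompt the user with the screen picker:

```typescript
// Illustrative helper (not part of this SDK): checks whether the current
// browser/OS provides an audio track for the surface the user picks.
const canCaptureScreenShareAudio = async (): Promise<boolean> => {
  const probe = await navigator.mediaDevices.getDisplayMedia({
    video: true,
    audio: true, // best-effort; silently ignored where unsupported
  });
  const hasAudio = probe.getAudioTracks().length > 0;
  // we only wanted to inspect the tracks, so stop them right away
  probe.getTracks().forEach((track) => track.stop());
  return hasAudio;
};
```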
@@ -61,6 +61,7 @@ This method can be found in `call.bindAudioElement`. It takes two arguments:

- the audio element to bind to
- the participant's `sessionId`
- the kind of track to bind to (either `audioTrack` or `screenShareAudioTrack` for screen sharing)

This method needs to be called only once, usually after the element is mounted in the DOM.

@@ -73,6 +74,10 @@ if (!audioElement) {

// bind the audio element to the participant's audio track
// use the returned `unbind()` function to unbind the audio element
-const unbind = call.bindAudioElement(audioElement, participant.sessionId);
+const unbind = call.bindAudioElement(
+  audioElement,
+  participant.sessionId,
+  'audioTrack',
+);
}
```
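
By extension, binding a participant's screen-share audio passes `'screenShareAudioTrack'` as the track type. A short sketch (the element lookup is illustrative):

```typescript
const screenShareAudioElement =
  document.querySelector<HTMLAudioElement>('#screen-share-audio');
if (screenShareAudioElement) {
  // use the returned `unbind()` function when the element leaves the DOM
  const unbind = call.bindAudioElement(
    screenShareAudioElement,
    participant.sessionId,
    'screenShareAudioTrack',
  );
}
```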
2 changes: 1 addition & 1 deletion packages/client/package.json
@@ -1,6 +1,6 @@
{
"name": "@stream-io/video-client",
-"version": "0.3.28",
+"version": "0.3.30",
"packageManager": "[email protected]",
"main": "dist/index.cjs.js",
"module": "dist/index.es.js",
68 changes: 56 additions & 12 deletions packages/client/src/Call.ts
@@ -68,6 +68,7 @@ import {
} from './gen/coordinator';
import { join, reconcileParticipantLocalState } from './rtc/flows/join';
import {
AudioTrackType,
CallConstructor,
CallLeaveOptions,
DebounceType,
@@ -76,6 +77,7 @@
StreamVideoParticipant,
StreamVideoParticipantPatches,
SubscriptionChanges,
TrackMuteType,
VideoTrackType,
VisibilityState,
} from './types';
@@ -120,6 +122,7 @@
CameraDirection,
CameraManager,
MicrophoneManager,
ScreenShareManager,
SpeakerManager,
} from './devices';

@@ -168,6 +171,11 @@
*/
readonly speaker: SpeakerManager;

/**
* Device manager for the screen.
*/
readonly screenShare: ScreenShareManager;

/**
* The DynascaleManager instance.
*/
@@ -281,6 +289,7 @@
this.camera = new CameraManager(this);
this.microphone = new MicrophoneManager(this);
this.speaker = new SpeakerManager();
this.screenShare = new ScreenShareManager(this);
}

private registerEffects() {
@@ -768,9 +777,21 @@
const {
audioStream,
videoStream,
-screenShareStream: screenShare,
+screenShareStream,
+screenShareAudioStream,
} = localParticipant;

let screenShare: MediaStream | undefined;
if (screenShareStream || screenShareAudioStream) {
screenShare = new MediaStream();
screenShareStream?.getVideoTracks().forEach((track) => {
screenShare?.addTrack(track);
});
screenShareAudioStream?.getAudioTracks().forEach((track) => {
screenShare?.addTrack(track);
});
}

// restore previous publishing state
if (audioStream) await this.publishAudioStream(audioStream);
if (videoStream) await this.publishVideoStream(videoStream);
@@ -1081,7 +1102,6 @@
* Consecutive calls to this method will replace the audio stream that is currently being published.
* The previous audio stream will be stopped.
*
-*
* @param audioStream the audio stream to publish.
*/
publishAudioStream = async (audioStream: MediaStream) => {
@@ -1112,10 +1132,13 @@
* Consecutive calls to this method will replace the previous screen-share stream.
* The previous screen-share stream will be stopped.
*
-*
* @param screenShareStream the screen-share stream to publish.
+* @param opts the options to use when publishing the stream.
*/
-publishScreenShareStream = async (screenShareStream: MediaStream) => {
+publishScreenShareStream = async (
+  screenShareStream: MediaStream,
+  opts: PublishOptions = {},
+) => {
// we should wait until we get a JoinResponse from the SFU,
// otherwise we risk breaking the ICETrickle flow.
await this.assertCallJoined();
@@ -1140,7 +1163,18 @@
screenShareStream,
screenShareTrack,
TrackType.SCREEN_SHARE,
opts,
);

const [screenShareAudioTrack] = screenShareStream.getAudioTracks();
if (screenShareAudioTrack) {
await this.publisher.publishStream(
screenShareStream,
screenShareAudioTrack,
TrackType.SCREEN_SHARE_AUDIO,
opts,
);
}
};

/**
@@ -1252,6 +1286,13 @@
dimension: p.screenShareDimension,
});
}
if (p.publishedTracks.includes(TrackType.SCREEN_SHARE_AUDIO)) {
subscriptions.push({
userId: p.userId,
sessionId: p.sessionId,
trackType: TrackType.SCREEN_SHARE_AUDIO,
});
}
}
// schedule update
this.trackSubscriptionsSubject.next({ type, data: subscriptions });
@@ -1414,7 +1455,7 @@
*
* @param type the type of the mute operation.
*/
-muteSelf = (type: 'audio' | 'video' | 'screenshare') => {
+muteSelf = (type: TrackMuteType) => {
const myUserId = this.currentUserId;
if (myUserId) {
return this.muteUser(myUserId, type);
@@ -1426,7 +1467,7 @@
*
* @param type the type of the mute operation.
*/
-muteOthers = (type: 'audio' | 'video' | 'screenshare') => {
+muteOthers = (type: TrackMuteType) => {
const trackType = muteTypeToTrackType(type);
if (!trackType) return;
const userIdsToMute: string[] = [];
@@ -1445,10 +1486,7 @@
* @param userId the id of the user to mute.
* @param type the type of the mute operation.
*/
-muteUser = (
-  userId: string | string[],
-  type: 'audio' | 'video' | 'screenshare',
-) => {
+muteUser = (userId: string | string[], type: TrackMuteType) => {
return this.streamClient.post<MuteUsersResponse, MuteUsersRequest>(
`${this.streamClientBasePath}/mute_users`,
{
@@ -1463,7 +1501,7 @@
*
* @param type the type of the mute operation.
*/
-muteAllUsers = (type: 'audio' | 'video' | 'screenshare') => {
+muteAllUsers = (type: TrackMuteType) => {
return this.streamClient.post<MuteUsersResponse, MuteUsersRequest>(
`${this.streamClientBasePath}/mute_users`,
{
@@ -1952,11 +1990,17 @@
*
* @param audioElement the audio element to bind to.
* @param sessionId the session id.
* @param trackType the kind of audio.
*/
-bindAudioElement = (audioElement: HTMLAudioElement, sessionId: string) => {
+bindAudioElement = (
+  audioElement: HTMLAudioElement,
+  sessionId: string,
+  trackType: AudioTrackType = 'audioTrack',
+) => {
const unbind = this.dynascaleManager.bindAudioElement(
audioElement,
sessionId,
trackType,
);

if (!unbind) return;
7 changes: 3 additions & 4 deletions packages/client/src/devices/CameraManager.ts
@@ -69,6 +69,7 @@ export class CameraManager extends InputMediaDeviceManager<CameraManagerState> {
protected getDevices(): Observable<MediaDeviceInfo[]> {
return getVideoDevices();
}

protected getStream(
constraints: MediaTrackConstraints,
): Promise<MediaStream> {
Expand All @@ -82,14 +83,12 @@ export class CameraManager extends InputMediaDeviceManager<CameraManagerState> {
}
return getVideoStream(constraints);
}

protected publishStream(stream: MediaStream): Promise<void> {
return this.call.publishVideoStream(stream);
}

protected stopPublishStream(stopTracks: boolean): Promise<void> {
return this.call.stopPublish(TrackType.VIDEO, stopTracks);
}

protected getTrack() {
return this.state.mediaStream?.getVideoTracks()[0];
}
}
