This repo has been archived. Please use the 100ms iOS SDK provided here: https://github.com/100mslive/100ms-ios-sdk
This repository contains example code written in Swift.
- Install CocoaPods 1.7.5 or newer.
- Run `pod install` from the root directory of this project. CocoaPods will install `HMSVideo.framework` and set up an `.xcworkspace`.
- Open `HMSVideoExample.xcworkspace`.

Note: You may need to update the CocoaPods Master Spec Repo by running `pod repo update master` in order to fetch the latest specs for HMSVideo.
To get started with the sample application, follow these steps:
- Open the `HMSVideo_Example` target from `HMSVideoExample.xcworkspace` in Xcode and go to the Signing & Capabilities tab.
- Replace the Bundle Identifier value with something unique to your org, and select a Team for code signing.
- Open `MeetingViewController.swift`.
- Replace the `tokenServerURL` constant value with the correct URL (see the instructions below for generating a token API server).
- Build and run the app.
You will need to send a token, generated from a combination of `access_key`, `customer_secret`, `customer_id`, `app_id`, `peer_id`, and `room_id`, when initializing the 100ms client.
- Follow the instructions here: https://100ms.gitbook.io/100ms/helpers/runkit
- Publish your RunKit and copy the endpoint URL (click on it and copy the URL that opens).
- Replace the `tokenServerURL` constant value in `MeetingViewController.swift` with the copied value (see the sketch below).
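A client-side fetch of the token from that endpoint might look roughly like the sketch below. The HTTP method, request body, and response shape here are assumptions; match them to however your RunKit token server is written.
import Foundation

//Replace with the RunKit endpoint URL you copied
//(the same value as the tokenServerURL constant in MeetingViewController.swift)
let tokenServerURL = "https://your-token-server.example.runkit.sh"

//Assumed response shape: { "token": "<jwt>" }; adjust to whatever your token server returns
struct TokenResponse: Decodable {
    let token: String
}

func fetchToken(roomId: String, completion: @escaping (String?) -> Void) {
    guard let url = URL(string: tokenServerURL) else {
        completion(nil)
        return
    }

    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    //The body fields are an assumption; send whatever fields your token server expects
    request.httpBody = try? JSONSerialization.data(withJSONObject: ["room_id": roomId])

    URLSession.shared.dataTask(with: request) { data, _, _ in
        let token = data.flatMap { try? JSONDecoder().decode(TokenResponse.self, from: $0) }?.token
        //Pass the token back on the main queue, e.g. to create HMSPeer(name:authToken:)
        DispatchQueue.main.async { completion(token) }
    }.resume()
}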
Here you will find everything you need to build video experiences with the 100ms iOS SDK. Dive into our SDKs and quick starts to add real-time video, voice, and screen sharing to your web and mobile applications.
- iOS 10.0+
- Xcode 11+
CocoaPods is a dependency manager for Cocoa projects. For usage and installation instructions, visit their website. To integrate Brytecam SDK into your Xcode project using CocoaPods, specify it in your Podfile:
pod 'HMSVideo', '~> 0.10.0'
Check out the example app at https://github.com/100mslive/hmsvideo-ios/tree/master/Example
- Room - A room represents a real-time audio, data, video and/or screenshare session, the basic building block of the Brytecam Video SDK
- Stream - A stream represents real-time audio, video and data media streams that are shared to a room
- Peer/Participant - A peer represents all participants connected to a room (other than the local participant)
- Publish - A local participant can share its audio, video and data tracks by "publishing" its tracks to the room
- Subscribe - A local participant can stream any peer's audio, video and data tracks by "subscribing" to their tracks
- Broadcast - A local participant can send any message/data to all peers in the room
This will instantiate an `HMSClient` object:
//Create an HMSPeer instance for local peer
let peer = HMSPeer(name: userName, authToken: "INSERT TOKEN HERE")
let config = HMSClientConfig()
//config.endpoint = "Override endpoint URL if needed"
//Create a 100ms video client
client = HMSClient(peer: peer, config: config)
See the token generation instructions above for how to generate your token. Use `wss://prod-in.100ms.live/ws` as the endpoint URL for production and `wss://staging-in.100ms.live/ws` as the endpoint URL for staging.
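For example, to target a specific cluster, you can set the endpoint on the `HMSClientConfig` created above before constructing the client (a minimal sketch based on the config override shown earlier):
//Point the client at the production cluster
config.endpoint = "wss://prod-in.100ms.live/ws"
//Or, for staging:
//config.endpoint = "wss://staging-in.100ms.live/ws"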
After joining, immediately add listeners so you are notified of peers joining or leaving and of streams being added to or removed from the room:
client.onPeerJoin = { (room, peer) in
// Update UI if needed
}
client.onPeerLeave = { (room, peer) in
// Update UI if needed
}
client.onStreamAdd = { (room, peer, streamInfo) in
// Subscribe to the stream if needed
}
client.onStreamRemove = { (room, peer, streamInfo) in
// Remove remote stream view if needed
}
client.onBroadcast = { (room, peer, message) in
// update UI if needed
}
client.onConnect = {
// Client connected, this is a good place to call join(room)
}
client.onDisconnect = { error in
// Connection lost or could not be established.
// Good place to retry or show an error to the user.
}
After instantiating `HMSClient`, connect to 100ms' server:
//The client will connect to the WebSocket channel provided through the config
client.connect()
//Pass the unique id for the room here as a String
let room = HMSRoom(roomId: roomName)
client.join(room) { (success, error) in
//check for error and publish a local stream
}
Generate a unique `roomId` for each session to avoid conflicts.
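One simple way to do that (a sketch; any scheme that guarantees uniqueness works) is to derive the room id from a UUID:
import Foundation

//Derive a unique room id per session from a UUID
let roomName = "meeting-" + UUID().uuidString.lowercased()
let room = HMSRoom(roomId: roomName)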
//You can set codec, bitrate, framerate, etc here.
let constraints = HMSMediaStreamConstraints()
constraints.shouldPublishAudio = true
constraints.shouldPublishVideo = true
constraints.codec = .VP8
constraints.bitrate = 256
constraints.frameRate = 25
constraints.resolution = .QVGA
let localStream = client.getLocalStream(constraints)
Please use the following settings for video that looks good at postcard size: codec `VP8`, bitrate `256`, frame rate `25`. We will extend this in the future to add more options, including front/back camera selection.
Apple requires your app to provide static messages to display to the user when the system asks for camera or microphone permission:
If your app uses device cameras, include the NSCameraUsageDescription key in your app’s Info.plist file.
If your app uses device microphones, include the NSMicrophoneUsageDescription key in your app’s Info.plist file.
For each key, provide a message that explains to the user why your app needs to capture media, so that the user can feel confident granting permission to your app.
Important: If the appropriate key is not present in your app’s Info.plist file when your app requests authorization or attempts to use a capture device, the system terminates your app.
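The usage description strings themselves live in Info.plist. If you also want to request camera and microphone access explicitly before starting capture (rather than letting the first capture trigger the prompt), a sketch using AVFoundation might look like this:
import AVFoundation

//Request camera and microphone access up front (optional; the system otherwise
//prompts when capture starts). Requires NSCameraUsageDescription and
//NSMicrophoneUsageDescription to be present in Info.plist.
func requestCapturePermissions(completion: @escaping (Bool) -> Void) {
    AVCaptureDevice.requestAccess(for: .video) { videoGranted in
        AVCaptureDevice.requestAccess(for: .audio) { audioGranted in
            DispatchQueue.main.async {
                completion(videoGranted && audioGranted)
            }
        }
    }
}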
This will not be covered by v0.10 SDK. Coming soon.
//The following code is a sample.
//Get the video capturer and video track
let videoCapturer = stream.videoCapturer
let localVideoTrack = stream.videoTracks?.first
//Begin capturing video from the camera
videoCapturer?.startCapture()
//Create a view for rendering video track and add to the UI hierarchy
if let track = localVideoTrack {
let videoView = HMSVideoView()
videoView.setVideoTrack(track)
view.addSubview(videoView)
}
A local participant can share their audio, video, and data tracks by "publishing" them to the room:
client.publish(localStream, room: room, completion: { (stream, error) in
//Handle error if any, update UI if needed
})
This method "subscribes" to a peer's stream. This should ideally be called in the onStreamAdd
listener
client.subscribe(streamInfo, room: room, completion: { (stream, error) in
//Handle error if any, update UI if needed
})
This method broadcasts a payload to all participants
client.broadcast(message, room: room, completion: { (stream, error) in
//Handle error if any, update UI if needed
})
client.unpublish(stream, room: room, completion: { (stream, error) in
//Handle error if any, update UI if needed
})
client.unsubscribe(stream, room: room, completion: { (stream, error) in
//Handle error if any, update UI if needed
})
//The client will disconnect from the WebSocket channel provided
client.disconnect()