iOS Video SDK
The EnableX iOS SDK (EnxRTCiOS) integrates real-time audio and video into native iOS applications. Built on WebRTC, it handles all signalling and media routing with EnableX servers and delivers session notifications through iOS Delegate Methods — the iOS pattern for event-driven communication.
- SDK v3.1.5 — released September 30, 2025
- Built in Swift (v3.0.0+); supports Objective-C apps with one limitation (see callout below)
- Requires iOS 15+ (v3.0.0); earlier releases support iOS 13+ (v2.3.2+) and iOS 12+ (v2.3.3)
- WebRTC Library v2.0.2 — released August 1, 2025
- Requires Socket.IO-Client-Swift ~> 16.1.0 — signalling dependency, installed automatically via CocoaPods or SPM
Before writing any iOS code, your backend server must have three things in place. These are server-side responsibilities — none of these credentials or operations should ever be performed inside the iOS app itself.
1. App Credentials
Log in to the EnableX Portal and create a project to
obtain your app_id and app_key. These credentials authenticate your server when calling
the EnableX Video API. Never embed them in the iOS app — treat them as server secrets.
2. A Room
A Room is a virtual session space. Create it via the Video REST API from your server and store the returned
room_id. The room persists until you delete it, so you typically create it once per meeting
and reuse it across participants.
3. A Token per Participant
Each participant who joins the session needs their own short-lived token. Your server generates the token
by calling the Create Token API with the room_id and participant details, then passes it to
the iOS app. The app uses this token to authenticate when calling connect().
Your app_key is required to generate tokens —
if it is embedded in the app, it can be extracted and abused to create unauthorized sessions.
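As a server-side sketch, the token request boils down to an authenticated POST against the Create Token endpoint. The endpoint path, request-body field names, and the Basic-auth scheme shown below are assumptions for illustration — confirm them against the EnableX Video API reference before use:

```python
import base64
import json

# Placeholder credentials -- obtain real values from the EnableX Portal
APP_ID, APP_KEY = "your_app_id", "your_app_key"
ROOM_ID = "your_room_id"

# EnableX server APIs authenticate with HTTP Basic auth:
# app_id as the username, app_key as the password
auth = base64.b64encode(f"{APP_ID}:{APP_KEY}".encode()).decode()
headers = {
    "Authorization": f"Basic {auth}",
    "Content-Type": "application/json",
}

# Token request body: participant name, role, and a unique user reference
body = json.dumps({
    "name": "Alice",
    "role": "participant",
    "user_ref": "alice-01",
})

# POST this to the Create Token endpoint (path is illustrative --
# verify the exact version/path in the EnableX Video API reference)
url = f"https://api.enablex.io/video/v1/rooms/{ROOM_ID}/tokens"
print(url)
```

Your server then returns the token from the API response to the iOS app, which passes it to connect().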
Download the latest SDK and WebRTC library below, or install via CocoaPods (recommended).
- Download iOS SDK v3.1.5
- Download WebRTC Library
Method 1 — CocoaPods (Recommended)
CocoaPods is the recommended integration path. It automatically resolves and installs the SDK together with
its WebRTC dependency, so you don't need to manage the EnablexWebRTC.xcframework separately.
Use this method unless your project has a specific reason to avoid dependency managers.
Step 1 — Install CocoaPods if not already present:
sudo gem install cocoapods
Step 2 — In your Xcode project directory, initialise a Podfile:
pod init
Step 3 — Open the generated Podfile and add the following pods:
pod 'EnxRTCiOS'
pod 'Socket.IO-Client-Swift', '~> 16.1.1'
Step 4 — Install the pods:
pod install
Step 5 — From this point forward, always open your project using the .xcworkspace file that CocoaPods generated, not the original .xcodeproj.
The EnablexWebRTC framework installs automatically when EnxRTCiOS is installed via
CocoaPods. You do not need to download or link the WebRTC library manually when using this method.
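The steps above combine into a single Podfile. The target name and platform version below are placeholders for your own project:

```ruby
# Podfile sketch -- replace 'MyVideoApp' and the platform version
# with your own target name and minimum deployment target
platform :ios, '15.0'

target 'MyVideoApp' do
  use_frameworks!
  pod 'EnxRTCiOS'
  pod 'Socket.IO-Client-Swift', '~> 16.1.1'
end
```

After `pod install` completes, remember to open the generated `.xcworkspace` rather than the `.xcodeproj`.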
Method 2 — Manual Integration (.xcframework)
If CocoaPods is not suitable for your project, you can manually add both frameworks to Xcode. Download both ZIP files from the buttons above, then follow these steps:
Step 1 — Extract both ZIP archives. You will get two .xcframework bundles:
EnxRTCiOS.xcframework and EnablexWebRTC.xcframework.
Step 2 — In Xcode, select your app target, go to the General tab, and
scroll to Frameworks, Libraries, and Embedded Content. Drag both .xcframework
bundles into that section.
Step 3 — For each framework, set the embed option to Embed & Sign.
Step 4 — Add Socket.IO-Client-Swift v16.1.0 either manually or via its own
CocoaPod. The SDK requires this library for its signalling layer.
Method 3 — Swift Package Manager (SPM)
SPM is natively supported in Xcode and requires no additional tooling. When added via SPM,
EnxRTCiOS automatically fetches and includes EnablexWebRTC as a dependency —
no separate WebRTC download needed.
EnxRTCiOS and EnablexWebRTC
are compiled for arm64 only. Simulator builds are not supported when using SPM.
Requirements: iOS 13.0+, Xcode 13.0+
Step 1 — In Xcode, go to File → Add Package Dependencies…
Step 2 — Enter the repository URL:
https://github.com/EnableX/EnxRTCiOS.git
Step 3 — Select the version rule — choose Up to Next Major from 3.1.4. Click Add Package and add it to your app target.
Alternatively, via Package.swift:
dependencies: [
.package(url: "https://github.com/EnableX/EnxRTCiOS.git", from: "3.1.4")
]
Step 4 — Import the framework in your Swift file:
import EnxRTCiOS
// EnablexWebRTC is automatically available through EnxRTCiOS
Required Capabilities (Xcode)
Real-time audio and video requires explicit device permission grants. Enable the following in your Xcode target under Signing & Capabilities:
- Camera — required for video capture
- Microphone — required for audio capture
- Background Modes — enable Audio, AirPlay, and Picture in Picture to keep audio running when the app moves to the background
You must also declare usage descriptions in your Info.plist. iOS will crash the app at runtime
if it attempts to access the camera or microphone without these keys present:
<key>NSCameraUsageDescription</key>
<string>Camera access is required for video calls.</string>
<key>NSMicrophoneUsageDescription</key>
<string>Microphone access is required for audio calls.</string>
iOS uses the Delegate Pattern instead of event listeners. Rather than attaching handlers to
event names, you designate an object as the room's delegate by making it conform to the
EnxRoomDelegate protocol. When something happens in the room — a participant joins,
recording starts, a stream is published — the SDK calls the corresponding delegate method on your
designated object. This is the standard iOS approach for asynchronous, event-driven communication.
You must set up your delegate object and assign it to the EnxRoom instance
before calling connect(). Delegate calls begin firing as soon as the connection
is established, so a delegate assigned after connect() risks missing critical early events.
Setting up the Delegate
Step 1 — Declare conformance to the required protocols in your ViewController header:
@interface MyViewController : UIViewController <EnxRoomDelegate, EnxStreamDelegate>
@end
Step 2 — Create the EnxRoom instance and assign your ViewController as
the delegate:
EnxRoom *room = [[EnxRoom alloc] init];
room.delegate = self;
Step 3 — Implement the delegate methods your app needs. The SDK calls these automatically at the appropriate moments:
// Called when the local client has successfully connected to the room
- (void)room:(EnxRoom *)room didConnect:(NSDictionary *)roomMetadata {
    NSLog(@"Connected to room: %@", roomMetadata);
}

// Called when another participant joins the session
- (void)room:(EnxRoom *)room userDidJoined:(NSArray *)data {
    NSLog(@"User joined: %@", data);
}

// Called when a remote stream becomes available — subscribe to receive it
- (void)room:(EnxRoom *)room didAddedStream:(EnxStream *)stream {
    [room subscribe:stream];
}
Always assign the delegate before calling connect(). Delegate methods like
room:didConnect: and room:didAddedStream: fire immediately as the connection is
established. Missing them means missing the initial room state — including remote streams that were already
active when your client joined.
Room-level vs Stream-level Delegates
The SDK separates event notifications into two protocol layers, each covering a different scope:
- Room delegate (EnxRoomDelegate) — handles session-wide events: connection and disconnection, recording state, participant joins and drops, moderation actions, chat messages, and active talker updates.
- Stream delegate (EnxStreamDelegate) — handles events scoped to a specific stream: mute and unmute notifications, media access changes, and video quality adjustments.
Implement EnxStreamDelegate on any object that needs to respond to changes on a particular
stream, and assign it via stream.delegate = self after subscribing to that stream.
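As a sketch, the assignment typically happens in the callback that confirms the subscription succeeded. The delegate method name below (room:didSubscribeStream:) is assumed from the SDK's naming pattern — verify it against the EnxRoomDelegate reference:

```objc
// Once the subscription is confirmed, make this object the stream's
// delegate so it receives stream-scoped callbacks (mute/unmute,
// media access changes, video quality adjustments).
- (void)room:(EnxRoom *)room didSubscribeStream:(EnxStream *)stream {
    stream.delegate = self;
}
```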
The iOS SDK is organised into four primary classes. Understanding what each class is responsible for helps you know where to look when implementing a feature or debugging an issue.
Start with EnxRtc, not EnxRoom. EnxRtc is the
recommended entry point. Call joinRoom() on it and the SDK handles stream creation, room
initialisation, and connection internally. You receive the EnxRoom reference back through
the didConnect callback — use that reference for all subsequent room-level operations.
Do not instantiate EnxRoom directly unless you specifically require low-level session control.
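A minimal join via EnxRtc might look like the following Objective-C sketch. The option dictionary keys and the exact joinRoom signature shown here are assumptions for illustration — confirm them against the EnxRtc class reference:

```objc
// Token received from your server (see the token section above)
NSString *token = @"<token-from-your-server>";

// Publish-stream and room options (keys shown are illustrative)
NSDictionary *streamInfo = @{ @"audio": @YES, @"video": @YES, @"data": @YES };
NSDictionary *roomInfo = @{ @"allow_reconnect": @YES };

// joinRoom handles stream creation, room initialisation, and
// connection in one call; the EnxRoom reference arrives later
// through the didConnect delegate callback.
EnxRtc *rtc = [[EnxRtc alloc] init];
EnxStream *localStream = [rtc joinRoom:token
                              delegate:self
                     publishStreamInfo:streamInfo
                              roomInfo:roomInfo
                        advanceOptions:nil];
```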
| Class | Purpose |
|---|---|
| EnxRtc | Entry point for quick session join via the convenience method joinRoom(). Use this when you want the SDK to handle stream creation and room connection in a single call. |
| EnxRoom | Core room object. Manages connection lifecycle (connect, disconnect), stream publishing and subscribing, and all room-level controls such as recording, moderation, and chat. |
| EnxStream | Represents an individual media stream, either local (your camera/mic) or remote (another participant's stream). Exposes controls for muting, camera switching, and video quality. |
| EnxPlayerView | A UIView subclass that renders a video stream. Add it to your view hierarchy and attach a stream to display video. |
When an SDK operation fails, the error is not thrown as an exception. Instead, the error information is delivered through the relevant delegate method's response array. Knowing the structure of this error object lets you handle failures correctly and surface meaningful messages to the user.
The error object returned in delegate responses has the following JSON structure:
{
  "errorCode": 5007,
  "msg": "Unauthorized Access",
  "desc": "Only moderators can start recording"
}
| Field | Type | Description |
|---|---|---|
| errorCode | Number | Numeric code that uniquely identifies the error type |
| msg | String | Short, human-readable error message |
| desc | String | Optional. A more detailed explanation of what caused the error and how to resolve it |
On success, the result data is at index 0 and <null> is at index 1. When an error occurs,
<null> is at index 0 and the error object is at index 1. Always check index 1 for
errors before reading result data at index 0.
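The index convention above can be handled with a small guard at the top of each delegate method. The delegate method name in this Objective-C sketch is assumed for illustration — substitute whichever response callback your app implements:

```objc
// Sketch of delegate-response error handling (method name assumed).
// Error, if present, sits at index 1; success data at index 0.
- (void)room:(EnxRoom *)room didStartRecordingEvent:(NSArray *)response {
    if (response.count > 1 && ![response[1] isEqual:[NSNull null]]) {
        NSDictionary *error = response[1];
        NSLog(@"Recording failed (%@): %@",
              error[@"errorCode"], error[@"msg"]);
        return;
    }
    NSLog(@"Recording started: %@", response[0]);
}
```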
The iOS SDK documentation is divided into focused topic pages. Choose the area you are working on: