Using the Avaya Client SDK, you can easily integrate the ability for users of your application to make and receive audio or video calls.
To make a video call, you must complete the following activities:
Creating a CSCall object allows you to set various properties for the call before the call is actually placed.
The CSCall object is created from the CSCallService.
CSCallService* callService = user.callService;
CSCall* call = [callService createCall];
You can specify the phone number to dial by setting the remote address.
call.remoteAddress = callNumber;
To monitor call events, use the <CSCallDelegate> protocol provided by the CSCall object. This protocol provides notifications as the call state changes.
Your application can define an object that implements the <CSCallDelegate> methods and set it on the CSCall object to receive callback notifications.
@interface AppCallHandler()
...
@end

@implementation AppCallHandler

- (void)callDidStart:(CSCall *)call {
    // Called to report that the call has started (i.e., the call is in progress).
    // Add code here to update the UI as appropriate.
}

- (void)callDidBeginRemoteAlerting:(CSCall *)call
                    withEarlyMedia:(BOOL)hasEarlyMedia {
    // Called to report that an outgoing call is ringing at the far end.
    // Add code here to update the UI as appropriate.
}

- (void)callDidEstablish:(CSCall *)call {
    // Called to report that an outgoing call has been established
    // (i.e., the far end has answered and a speech path has been established).
    // Add code here to update the UI as appropriate.
}

- (void)callDidEnd:(CSCall *)call reason:(CSCallEndReason)reason {
    // Called to report that the call has ended.
    // Add code here to update the UI as appropriate.
}

- (void)call:(CSCall *)call didFailWithError:(NSError *)error {
    // Called to report that the call has failed; the failure reason
    // is described in the error parameter.
    // Add code here to update the UI as appropriate.
}

- (void)updateVideoChannels:(NSArray *)videoChannels {
    // Called to report that the video channels of the call have been updated.
    // Add code here to update the UI as appropriate.
}

...
@end
You can instantiate an application call handler (AppCallHandler) and set it as the call's delegate.
AppCallHandler *callHandler = [[AppCallHandler alloc] init];
call.delegate = callHandler;
The application needs to obtain one or two resources when answering a call with video. If the application wishes to transmit video, it must locate a video camera. Camera availability is verified through the CSVideoCapturerOSX class, which manages most aspects of camera operation, including verification of available hardware. Regardless of whether a camera is needed, a video interface is required.
CSVideoInterface* videoInterface = user.mediaServices.videoInterface;
CSVideoCapturerOSX* videoCapturer = [[CSVideoCapturerOSX alloc] init];
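For example, the capturer's device list can drive the choice of video mode before answering. This is a sketch rather than the SDK's prescribed flow; it assumes that availableDevices returns an empty array when no camera is attached and that a CSVideoModeReceiveOnly value is defined alongside CSVideoModeSendReceive:

```objc
// Choose a video mode based on camera availability (sketch).
CSVideoMode videoMode = CSVideoModeReceiveOnly;  // render-only fallback
if (videoCapturer.availableDevices.count > 0) {
    videoMode = CSVideoModeSendReceive;          // a camera is present
}
```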
To answer the incoming call with video, call setVideoMode with CSVideoModeSendReceive, and then call the start method on the incoming call object.
[call setVideoMode:CSVideoModeSendReceive
   completionHandler:^(NSError *error) {
       // Handle any error that occurred while setting the video mode.
   }];
[call start];
Video transmission and rendering can be updated as follows:
- (void)updateVideoChannels:(NSArray *)videoChannels {
    CSVideoChannel *videoChannel = videoChannels[0];
    if (videoChannel.enabled) {
        // Start video rendering
        if (videoChannel.negotiatedDirection == CSMediaDirectionSendReceive
            || videoChannel.negotiatedDirection == CSMediaDirectionReceiveOnly) {
            // An object on the view that is responsible for video rendering
            CSVideoRendererOSX* remoteSink = [[CSVideoRendererOSX alloc] init];
            [[videoInterface getRemoteVideoSource:videoChannel.channelId]
                setVideoSink:remoteSink];
        }
        // Start video transmission
        if (videoChannel.negotiatedDirection == CSMediaDirectionSendReceive
            || videoChannel.negotiatedDirection == CSMediaDirectionSendOnly) {
            [videoCapturer setVideoSink:
                [videoInterface getLocalVideoSink:videoChannel.channelId]];
            CSVideoCaptureFormat* format = [[CSVideoCaptureFormat alloc] init];
            format.minRate = 15;
            format.maxRate = 30;
            format.maxWidth = 1280;
            format.maxHeight = 720;
            CSVideoCaptureDevice* videoCaptureDevice =
                videoCapturer.availableDevices.firstObject;
            [videoCapturer openVideoCaptureDevice:videoCaptureDevice
                                       withFormat:format
                                   withCompletion:^(NSError *error) {
                // Handle any error that occurred while opening the device.
            }];
        }
    }
    else {
        // Stop video rendering and transmission
        [[videoInterface getRemoteVideoSource:videoChannel.channelId]
            setVideoSink:nil];
        [videoCapturer setVideoSink:nil];
    }
}
To terminate the call from the application, use the end method on the call object.
[call end];
The callDidEnd:reason: callback is sent to the call delegate when the call has ended. Use this event to update the UI of your application. Ending the call deallocates the video channel and releases the video camera automatically, but any render surface allocated by the application must be released by the application. Again, see SdkSampleApp for the steps.
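As a minimal sketch of that cleanup, assuming the handler retains the capturer and the renderer created earlier in hypothetical videoCapturer and remoteSink properties, the application-owned surface can be released when the end notification arrives:

```objc
- (void)callDidEnd:(CSCall *)call reason:(CSCallEndReason)reason {
    // The video channel and camera are released by the SDK; detach and
    // release only the application-allocated pieces.
    [self.videoCapturer setVideoSink:nil];  // stop feeding local frames
    self.remoteSink = nil;                  // release the render surface
    // Add code here to update the UI as appropriate.
}
```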