Updated: 2019-11-20
Ensure that the development environment meets the following requirements.

| Environment/Tool | Version Requirement |
| --- | --- |
| Operating system | Mac OS X High Sierra 10.13.6 or later |
| Xcode | Xcode 9.4.1 or later |
| iOS version | iOS 9.0 or later, 64-bit devices |
| Meeting cloud service user account | The account can be obtained from HUAWEI CLOUD. For details about how to enable the service, see "Preparations for Development". |
Because the TUP library files are static libraries, the dependent libraries must be added manually. The following describes only how to add libc++.tbd; the other libraries are added in the same way.
- Foundation.framework: Contains classes and methods of the Cocoa Foundation layer.
- UIKit.framework: Contains classes and methods for the user interface layer of iOS applications.
- OpenGLES.framework: Contains the OpenGL ES interfaces. OpenGL ES is the embedded version of the cross-platform OpenGL 2D and 3D rendering library.
- AudioToolbox.framework: Contains interfaces for processing audio stream data and for playing or recording audio.
- VideoToolbox.framework: Provides hardware video decoding and encoding APIs.
- AVFoundation.framework: Contains Objective-C interfaces for playing or recording audio.
- CoreAudio.framework: Contains the data types used by the Core Audio framework.
- CoreData.framework: Contains interfaces for managing the application data model.
- MediaPlayer.framework: Contains interfaces for displaying full-screen video.
- SystemConfiguration.framework: Contains interfaces for handling the device network configuration.
- Security.framework: Contains interfaces for managing certificates, public and private keys, and trust policies.
- QuartzCore.framework: Contains the Core Animation interfaces.
- CoreVideo.framework: Contains low-level routines for processing video.
- CoreMedia.framework: Contains low-level routines for manipulating audio and video.
- MobileCoreServices.framework: Defines the uniform type identifiers (UTIs) supported by the system.
Following the same method, add "Privacy - Camera Usage Description", "Privacy - Contacts Usage Description", and "Privacy - Photo Library Usage Description" in turn.

- Microphone permission: Privacy - Microphone Usage Description
- Camera permission: Privacy - Camera Usage Description
- Photo library permission: Privacy - Photo Library Usage Description
- Contacts permission: Privacy - Contacts Usage Description
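Equivalently, the four permission entries can be added directly to the app's Info.plist source. A minimal sketch; the usage-description strings below are placeholders you should replace with wording appropriate for your app:

```xml
<!-- Privacy - Microphone Usage Description -->
<key>NSMicrophoneUsageDescription</key>
<string>The app needs microphone access for meeting audio.</string>
<!-- Privacy - Camera Usage Description -->
<key>NSCameraUsageDescription</key>
<string>The app needs camera access for video calls.</string>
<!-- Privacy - Photo Library Usage Description -->
<key>NSPhotoLibraryUsageDescription</key>
<string>The app needs photo library access to share images.</string>
<!-- Privacy - Contacts Usage Description -->
<key>NSContactsUsageDescription</key>
<string>The app needs contacts access for the corporate address book.</string>
</dict>
```

Note that the human-readable names shown in the Xcode property list editor ("Privacy - … Usage Description") correspond to the raw `NS…UsageDescription` keys above.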
Source code link: ViewController.mm
Code Reference

Click the file names in the preceding structure to obtain the source code, and fill in the code for each file. Some code snippets are provided for reference:
```objectivec
// Global callback that receives TSDK notifications; login results arrive here asynchronously.
TSDK_VOID onTSDKNotifications(TSDK_UINT32 msgid, TSDK_UINT32 param1, TSDK_UINT32 param2, TSDK_VOID *data)
{
    NSLog(@"onTUPLoginNotifications : %#x", msgid);
    dispatch_async(dispatch_get_main_queue(), ^{
        switch (msgid) {
            case TSDK_E_LOGIN_EVT_AUTH_SUCCESS:
            {
                NSLog(@"Uportal login success!");
                [ViewController showMessages:@"Uportal login success"];
            }
                break;
            case TSDK_E_LOGIN_EVT_AUTH_FAILED:
            {
                NSLog(@"Uportal login fail!");
                [ViewController showMessages:@"Uportal login fail"];
            }
                break;
            default:
                break;
        }
    });
}

- (BOOL)initUportalLoginService
{
    // Configure the SDK log parameters.
    TSDK_S_LOG_PARAM logParam;
    memset(&logParam, 0, sizeof(TSDK_S_LOG_PARAM));
    NSString *logPath = [[NSHomeDirectory() stringByAppendingPathComponent:@"Documents"] stringByAppendingString:@"/TUPC60log"];
    NSString *path = [logPath stringByAppendingString:@"/tsdk"];
    logParam.level = TSDK_E_LOG_DEBUG;
    logParam.file_count = 1;
    logParam.max_size_kb = 4 * 1024;
    strcpy(logParam.path, [path UTF8String]);
    TSDK_RESULT result = tsdk_set_config_param(TSDK_E_CONFIG_LOG_PARAM, &logParam);

    // Describe the application capabilities and initialize the SDK,
    // registering onTSDKNotifications as the notification callback.
    TSDK_S_APP_INFO_PARAM app_info;
    memset(&app_info, 0, sizeof(TSDK_S_APP_INFO_PARAM));
    app_info.client_type = TSDK_E_CLIENT_MOBILE;
    strcpy(app_info.product_name, "SoftClient on Mobile");
    app_info.support_audio_and_video_call = TSDK_TRUE;
    app_info.support_ctd = TSDK_TRUE;
    app_info.support_audio_and_video_conf = TSDK_TRUE;
    app_info.support_enterprise_address_book = TSDK_TRUE;
    result = tsdk_init(&app_info, &onTSDKNotifications);

    return result == TSDK_SUCCESS ? YES : NO;
}
```
```objectivec
NSString *account = @"Account";
NSString *password = @"Password";
NSString *serverAddress = @"192.168.1.100";
int port = 443;

- (BOOL)loginAuthorizeWithServerAddress:(NSString *)serverAddress
                                   port:(int)port
                                account:(NSString *)account
                               password:(NSString *)password
{
    // Fill in the login parameters and initiate authentication.
    TSDK_S_LOGIN_PARAM loginParam;
    memset(&loginParam, 0, sizeof(TSDK_S_LOGIN_PARAM));
    loginParam.user_id = 1;
    loginParam.auth_type = TSDK_E_AUTH_NORMAL;
    strcpy(loginParam.user_name, [account UTF8String]);
    strcpy(loginParam.password, [password UTF8String]);
    loginParam.server_type = TSDK_E_SERVER_TYPE_PORTAL;
    strcpy(loginParam.server_addr, [serverAddress UTF8String]);
    loginParam.server_port = (TSDK_UINT16)port;

    TSDK_RESULT result = tsdk_login(&loginParam);
    return result == TSDK_SUCCESS ? YES : NO;
}
```
```objectivec
TSDK_VOID onTSDKNotifications(TSDK_UINT32 msgid, TSDK_UINT32 param1, TSDK_UINT32 param2, TSDK_VOID *data)
{
    NSLog(@"onTUPLoginNotifications : %#x", msgid);
    dispatch_async(dispatch_get_main_queue(), ^{
        switch (msgid) {
            case TSDK_E_LOGIN_EVT_AUTH_SUCCESS:
            {
                NSLog(@"Uportal login success!");
                [ViewController showMessages:@"Uportal login success"];
            }
                break;
            case TSDK_E_LOGIN_EVT_AUTH_FAILED:
            {
                NSLog(@"Uportal login fail!");
                [ViewController showMessages:@"Uportal login fail"];
            }
                break;
            default:
                break;
        }
    });
}
```
| Category | Input | Description | Remarks |
| --- | --- | --- | --- |
| Authentication information | ServerAddress | eSDK server address | The input parameter cannot be empty |
| | Server Port | eSDK server port number | |
| | Account | eSDK login account | |
| | Password | eSDK authentication password | The input parameter cannot be empty |
The preceding information must be obtained from the Huawei remote lab after the lab is successfully reserved.
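Once the lab has issued these values, they map directly onto the login method shown above. A minimal sketch, using hypothetical placeholder values (replace them with the real ones from your reservation):

```objectivec
// Hypothetical values issued by the remote lab reservation.
BOOL requested = [self loginAuthorizeWithServerAddress:@"10.10.10.10"
                                                  port:443
                                               account:@"labAccount"
                                              password:@"labPassword"];
NSLog(@"tsdk_login request %@", requested ? @"sent" : @"failed");
```

Note that a `YES` return only means the login request was issued; the actual authentication result arrives asynchronously in the `onTSDKNotifications` callback as `TSDK_E_LOGIN_EVT_AUTH_SUCCESS` or `TSDK_E_LOGIN_EVT_AUTH_FAILED`.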
Description

Screen sharing on iOS is implemented by developing an iOS extension. The following steps describe how to integrate the SDK to build a screen sharing extension. Because this feature depends on system capabilities, screen sharing requires iOS 12 or later.

Service Process
- ReplayKit.framework: system library
- CloudLinkMeetingScreenShare.framework: library provided by the SDK
```objectivec
#import "SampleHandler.h"
#import <CloudLinkMeetingScreenShare/ScreenShareManager.h>

@interface SampleHandler () <ScreenShareManagerDelegate>
@property (strong, nonatomic) ScreenShareManager *screenShareManager;
@end

@implementation SampleHandler

- (instancetype)init
{
    if (self = [super init]) {
        // yourAppGroupIdentifier: the App Group identifier you applied for.
        self.screenShareManager = [[ScreenShareManager alloc] initWithAppGroupIdentifier:yourAppGroupIdentifier];
        self.screenShareManager.delegate = self;
    }
    return self;
}

- (void)dealloc
{
    self.screenShareManager = nil;
}

- (void)broadcastStartedWithSetupInfo:(NSDictionary<NSString *, NSObject *> *)setupInfo
{
    // User has requested to start the broadcast. Setup info from the UI extension can be supplied but is optional.
    [self.screenShareManager broadcastStartedWithSetupInfo:setupInfo];
}

- (void)broadcastPaused
{
    // User has requested to pause the broadcast. Samples will stop being delivered.
    [self.screenShareManager broadcastPaused];
}

- (void)broadcastResumed
{
    // User has requested to resume the broadcast. Sample delivery will resume.
    [self.screenShareManager broadcastResumed];
}

- (void)broadcastFinished
{
    // User has requested to finish the broadcast.
    [self.screenShareManager broadcastFinished];
}

- (void)processSampleBuffer:(CMSampleBufferRef)sampleBuffer withType:(RPSampleBufferType)sampleBufferType
{
    switch (sampleBufferType) {
        case RPSampleBufferTypeVideo:
            // Hand video sample buffers to the SDK.
            [self.screenShareManager processSampleBuffer:sampleBuffer withType:RPSampleBufferTypeVideo];
            break;
        case RPSampleBufferTypeAudioApp:
            // Handle audio sample buffers for app audio.
            break;
        case RPSampleBufferTypeAudioMic:
            // Handle audio sample buffers for mic audio.
            break;
        default:
            break;
    }
}

- (void)screenShareManagerFinishBroadcastWithError:(NSError *)error
{
    [self finishBroadcastWithError:error];
}

@end
```