[Translation] Capturing Video on iOS

Original article: https://www.objc.io/issues/23-video/capturing-video/

With every generation of processor and camera hardware, shooting video on an iPhone becomes more and more interesting. iPhones are small and light, the quality gap to professional video cameras has narrowed considerably, and in some situations an iPhone makes a perfectly viable backup camera. This article covers the different options for configuring the video capture pipeline so you can get the most out of the hardware. A simple sample app demonstrating the different pipeline setups is available on GitHub.

UIImagePickerController

By far the easiest way to integrate video capture into your app is to use UIImagePickerController. It is a view controller that wraps a complete video capture pipeline together with a camera UI.

Before launching the camera, first check that the current device actually supports video recording:

// Note: kUTTypeMovie below comes from the MobileCoreServices framework
// (#import <MobileCoreServices/MobileCoreServices.h>).
if ([UIImagePickerController
       isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera]) {
    NSArray *availableMediaTypes = [UIImagePickerController
      availableMediaTypesForSourceType:UIImagePickerControllerSourceTypeCamera];
    if ([availableMediaTypes containsObject:(NSString *)kUTTypeMovie]) {
        // Video recording is supported.
    }
}

Then create a UIImagePickerController object, and define a delegate to further process recorded videos (e.g. to save them to the camera roll) and respond to the user dismissing the camera:

UIImagePickerController *camera = [UIImagePickerController new];
camera.sourceType = UIImagePickerControllerSourceTypeCamera;
camera.mediaTypes = @[(NSString *)kUTTypeMovie];
camera.delegate = self;

That is all the code it takes to bring up a fully functional camera.
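
To present the camera and handle the result, the delegate (which must adopt both UIImagePickerControllerDelegate and UINavigationControllerDelegate) could be implemented roughly as follows. This is a minimal sketch, not part of the original article, that saves the recorded movie to the camera roll and dismisses the picker; the API used (UIImagePickerControllerMediaURL, UISaveVideoAtPathToSavedPhotosAlbum, etc.) is standard UIKit:

// Present the camera, e.g. from a button action:
[self presentViewController:camera animated:YES completion:nil];

// Delegate callbacks:
- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    NSString *moviePath = [info[UIImagePickerControllerMediaURL] path];
    // Save the recording to the camera roll if its format is compatible.
    if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(moviePath)) {
        UISaveVideoAtPathToSavedPhotosAlbum(moviePath, nil, NULL, NULL);
    }
    [picker dismissViewControllerAnimated:YES completion:nil];
}

- (void)imagePickerControllerDidCancel:(UIImagePickerController *)picker
{
    [picker dismissViewControllerAnimated:YES completion:nil];
}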

Configuring the Camera

UIImagePickerController offers further configuration options. The cameraDevice property lets you select a specific camera. It takes a value of the UIImagePickerControllerCameraDevice enum. By default it is set to UIImagePickerControllerCameraDeviceRear (the back camera), but it can also be set to UIImagePickerControllerCameraDeviceFront (the front camera). Always check first that the camera you want to use is actually available:

UIImagePickerController *camera = …
if ([UIImagePickerController isCameraDeviceAvailable:UIImagePickerControllerCameraDeviceFront]) {
    [camera setCameraDevice:UIImagePickerControllerCameraDeviceFront];
}

The videoQuality property controls the quality of the recorded video. It lets you change the transcoding preset, which affects both the bit rate and the resolution of the video. There are six presets:

enum {
   UIImagePickerControllerQualityTypeHigh             = 0,
   UIImagePickerControllerQualityTypeMedium           = 1,  // default  value
   UIImagePickerControllerQualityTypeLow              = 2,
   UIImagePickerControllerQualityType640x480          = 3,
   UIImagePickerControllerQualityTypeIFrame1280x720   = 4,
   UIImagePickerControllerQualityTypeIFrame960x540    = 5
};
typedef NSUInteger  UIImagePickerControllerQualityType;

The first three are relative presets (low, medium, and high). The encoding configuration these presets actually produce can vary from device to device; the high preset gives you the highest quality available on the selected camera. The other three are resolution-specific presets (640x480 VGA, 960x540 iFrame, and 1280x720 iFrame).
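
For example, to record 720p iFrame video instead of the default medium preset (a one-line sketch; the actual output still depends on what the device supports):

camera.videoQuality = UIImagePickerControllerQualityTypeIFrame1280x720;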

Customizing the UI

As mentioned before, UIImagePickerController comes with a complete camera UI out of the box. However, you can also customize the camera interface: hide the default controls and supply a custom view with your own controls instead. The custom overlay view is displayed on top of the camera preview view:

UIView *cameraOverlay = …
picker.showsCameraControls = NO;
picker.cameraOverlayView = cameraOverlay;

You then have to hook your custom controls up to the control methods of UIImagePickerController (e.g. startVideoCapture and stopVideoCapture).
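
A minimal sketch of such a hookup; recordButton, recordTapped:, and the recording flag are illustrative names, not part of the original article:

// Wire a custom record button in the overlay to the picker:
[recordButton addTarget:self
                 action:@selector(recordTapped:)
       forControlEvents:UIControlEventTouchUpInside];

- (void)recordTapped:(UIButton *)sender
{
    if (!self.recording) {
        // startVideoCapture returns NO if recording could not be started.
        self.recording = [self.picker startVideoCapture];
    } else {
        [self.picker stopVideoCapture];
        self.recording = NO;
    }
}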

AVFoundation

If you want more control over the video capture process, you have to use the AVFoundation framework.

The central class in AVFoundation for handling video capture is AVCaptureSession. It coordinates the flow of data between audio and video inputs and outputs.

To use a capture session, you instantiate it, add inputs and outputs, and then start the session:

AVCaptureSession *captureSession = [AVCaptureSession new];
AVCaptureDeviceInput *cameraDeviceInput = …
AVCaptureDeviceInput *micDeviceInput = …
AVCaptureMovieFileOutput *movieFileOutput = …
if ([captureSession canAddInput:cameraDeviceInput]) {
    [captureSession addInput:cameraDeviceInput];
}
if ([captureSession canAddInput:micDeviceInput]) {
    [captureSession addInput:micDeviceInput];
}
if ([captureSession canAddOutput:movieFileOutput]) {
    [captureSession addOutput:movieFileOutput];
}

[captureSession startRunning];

(For brevity, the dispatch queue code has been omitted from the snippet above. Because all calls on a capture session block, it is recommended to dispatch them to a background queue.)
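
One way the elided inputs and output above might be created, and how the blocking calls can be kept off the main queue, is sketched below using the default capture devices (error handling kept minimal; not the exact code from the sample app):

NSError *error = nil;
AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDevice *mic = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
AVCaptureDeviceInput *cameraDeviceInput =
    [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
AVCaptureDeviceInput *micDeviceInput =
    [AVCaptureDeviceInput deviceInputWithDevice:mic error:&error];
AVCaptureMovieFileOutput *movieFileOutput = [AVCaptureMovieFileOutput new];

// All capture session calls block, so keep them off the main queue:
dispatch_queue_t sessionQueue = dispatch_queue_create("session queue", DISPATCH_QUEUE_SERIAL);
dispatch_async(sessionQueue, ^{
    [captureSession startRunning];
});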

The quality of the output can be configured via the sessionPreset property of AVCaptureSession. There are 11 different preset options:

NSString *const  AVCaptureSessionPresetPhoto;
NSString *const  AVCaptureSessionPresetHigh;
NSString *const  AVCaptureSessionPresetMedium;
NSString *const  AVCaptureSessionPresetLow;
NSString *const  AVCaptureSessionPreset352x288;
NSString *const  AVCaptureSessionPreset640x480;
NSString *const  AVCaptureSessionPreset1280x720;
NSString *const  AVCaptureSessionPreset1920x1080;
NSString *const  AVCaptureSessionPresetiFrame960x540;
NSString *const  AVCaptureSessionPresetiFrame1280x720;
NSString *const  AVCaptureSessionPresetInputPriority;

The first one is for high-resolution photo output. The next nine are very similar to the UIImagePickerControllerQualityType options we saw for the videoQuality setting of UIImagePickerController, with the exception that there are a few additional presets available for a capture session. The last one (AVCaptureSessionPresetInputPriority) indicates that the capture session does not control the audio and video output settings. Instead, the activeFormat of the connected capture device dictates the quality level at the outputs of the capture session. In the next section, we will look at devices and device formats in more detail.
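
Applying one of these presets is straightforward; a short sketch that first verifies the session supports the preset:

if ([captureSession canSetSessionPreset:AVCaptureSessionPreset1280x720]) {
    captureSession.sessionPreset = AVCaptureSessionPreset1280x720;
}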

Inputs

The inputs for an AVCaptureSession are one or more AVCaptureDevice objects connected to the capture session through an AVCaptureDeviceInput.

We can use [AVCaptureDevice devices] to find the available capture devices. For an iPhone 6, they are:

(
    "<AVCaptureFigVideoDevice: 0x136514db0 [Back Camera][com.apple.avfoundation.avcapturedevice.built-in_video:0]>",
    "<AVCaptureFigVideoDevice: 0x13660be80 [Front Camera][com.apple.avfoundation.avcapturedevice.built-in_video:1]>",
    "<AVCaptureFigAudioDevice: 0x174265e80 [iPhone Microphone][com.apple.avfoundation.avcapturedevice.built-in_audio:0]>"
)
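
To grab a particular device from that list, e.g. the front camera, you might filter by media type and position (a sketch, not from the original article):

AVCaptureDevice *frontCamera = nil;
for (AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
    if (device.position == AVCaptureDevicePositionFront) {
        frontCamera = device;
        break;
    }
}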