iOS Development: Editing and Trimming Album Videos
There are two ways to edit album videos on iOS. The first is the system-provided controller UIVideoEditorController, but that class offers only basic editing features, its interface is very limited, and its appearance cannot be customized, as shown below. UIVideoEditorController's video display UI is very similar to UIImagePickerController's; the difference is that the former can edit and the latter cannot.
The second way is to build the feature yourself on top of the powerful AVFoundation framework.
The implementation breaks down into five main steps:
- 1. Loop-play the video with AVPlayer
If the entire video should loop, there are two ways to do it. The first is to use KVO to observe AVPlayer's timeControlStatus property:
typedef NS_ENUM(NSInteger, AVPlayerTimeControlStatus) {
    AVPlayerTimeControlStatusPaused,
    AVPlayerTimeControlStatusWaitingToPlayAtSpecifiedRate,
    AVPlayerTimeControlStatusPlaying
} NS_ENUM_AVAILABLE(10_12, 10_0);
When the status becomes AVPlayerTimeControlStatusPaused, seek the player back to the start and resume playback:
[self.player seekToTime:CMTimeMake(0, 1)];
[self.player play];
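A minimal sketch of this first approach (assuming self.player is the AVPlayer instance and the observer is registered once, e.g. after creating the player; the demo's actual code may differ):

```objc
// Register the KVO observer once, e.g. right after creating the player
[self.player addObserver:self
              forKeyPath:@"timeControlStatus"
                 options:NSKeyValueObservingOptionNew
                 context:nil];

// KVO callback: when playback pauses (end of video), rewind and resume
- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context {
    if ([keyPath isEqualToString:@"timeControlStatus"] &&
        self.player.timeControlStatus == AVPlayerTimeControlStatusPaused) {
        [self.player seekToTime:CMTimeMake(0, 1)];
        [self.player play];
    }
}
```

Remember to remove the observer (removeObserver:forKeyPath:) when tearing the player down.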
The second way to loop is to use a timer whose interval equals the desired playback duration and have it repeatedly invoke a playback method:
- (void)repeatPlay{
    [self.player play];
    CMTime start = CMTimeMakeWithSeconds(self.startTime, self.player.currentTime.timescale);
    [self.player seekToTime:start toleranceBefore:kCMTimeZero toleranceAfter:kCMTimeZero];
}
When editing, looping only the selected segment clearly has to use this second approach.
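One possible shape for the timer driving repeatPlay above (a sketch: self.timer is an assumed NSTimer property, and self.startTime / self.endTime are assumed to hold the selected range in seconds; the demo's actual implementation may differ):

```objc
// Restart the segment timer whenever the selected range changes.
// The interval is the segment length, so repeatPlay fires once per loop.
- (void)startTimer {
    [self.timer invalidate];
    self.timer = [NSTimer scheduledTimerWithTimeInterval:(self.endTime - self.startTime)
                                                  target:self
                                                selector:@selector(repeatPlay)
                                                userInfo:nil
                                                 repeats:YES];
    [self repeatPlay]; // kick off the first pass immediately
}

- (void)stopTimer {
    [self.timer invalidate];
    self.timer = nil;
}
```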
- 2. Grab video frame images at one-second intervals
The editing strip needs to display frame thumbnails, which are obtained through AVAssetImageGenerator. When calling its frame-extraction API, you pass in the list of time points at which frames are wanted:
- (void)generateCGImagesAsynchronouslyForTimes:(NSArray<NSValue *> *)requestedTimes completionHandler:(AVAssetImageGeneratorCompletionHandler)handler;
In a video-editing feature, the time points are typically one second apart. Because the completion handler runs asynchronously on a background queue, the thumbnails must be handed back to the main queue before being displayed. The full implementation follows.
#pragma mark Read and parse video frames
- (void)analysisVideoFrames{
    // Create the asset object
    AVURLAsset *videoAsset = [[AVURLAsset alloc] initWithURL:self.videoUrl options:nil];
    // Total duration in seconds = value / timescale
    long videoSumTime = videoAsset.duration.value / videoAsset.duration.timescale;
    // Create the AVAssetImageGenerator
    AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:videoAsset];
    generator.maximumSize = bottomView.frame.size;
    generator.appliesPreferredTrackTransform = YES;
    generator.requestedTimeToleranceBefore = kCMTimeZero;
    generator.requestedTimeToleranceAfter = kCMTimeZero;
    // Build the array of times (one per second) at which frames are requested
    self.framesArray = [NSMutableArray array];
    for (int i = 0; i < videoSumTime; i++) {
        CMTime time = CMTimeMake(i * videoAsset.duration.timescale, videoAsset.duration.timescale);
        NSValue *value = [NSValue valueWithCMTime:time];
        [self.framesArray addObject:value];
    }
    __block long count = 0;
    [generator generateCGImagesAsynchronouslyForTimes:self.framesArray completionHandler:^(CMTime requestedTime, CGImageRef img, CMTime actualTime, AVAssetImageGeneratorResult result, NSError *error){
        if (result == AVAssetImageGeneratorSucceeded) {
            UIImage *image = [UIImage imageWithCGImage:img];
            long index = count;
            // UIKit is not thread-safe: create and add the image view on the main queue
            dispatch_async(dispatch_get_main_queue(), ^{
                UIImageView *thumbImgView = [[UIImageView alloc] initWithFrame:CGRectMake(50 + index * self.IMG_Width, 0, self.IMG_Width, 70)];
                thumbImgView.image = image;
                [editScrollView addSubview:thumbImgView];
                editScrollView.contentSize = CGSizeMake(100 + index * self.IMG_Width, 0);
            });
            count++;
        }
        if (result == AVAssetImageGeneratorFailed) {
            NSLog(@"Failed with error: %@", [error localizedDescription]);
        }
        if (result == AVAssetImageGeneratorCancelled) {
            NSLog(@"AVAssetImageGeneratorCancelled");
        }
    }];
    [editScrollView setContentOffset:CGPointMake(50, 0)];
}
- 3. Add the editing overlay and make AVPlayer loop the selected time range
For the trim box, my approach is to add a handle view on each side. A pan gesture attached to their parent view checks whether the current touch lies on a handle; if so, the handle is moved by the drag distance, and the playback start/end times are adjusted accordingly.
- 4. Track the trim handles and the frame strip, and adjust the segment AVPlayer loops
Moving the handles was covered in step 3. The frame thumbnails sit on a UIScrollView (a UICollectionView would also work). The logic for tracking the scroll position and adjusting the playback range as the handles move is a bit more involved; the demo link is attached at the end so you can read the full implementation. Here is the gesture-handling part of the code.
#pragma mark Pan gesture on the editing area
- (void)moveOverlayView:(UIPanGestureRecognizer *)gesture{
    switch (gesture.state) {
        case UIGestureRecognizerStateBegan:
        {
            [self stopTimer];
            BOOL isRight = [rightDragView pointInsideImgView:[gesture locationInView:rightDragView]];
            BOOL isLeft = [leftDragView pointInsideImgView:[gesture locationInView:leftDragView]];
            _isDraggingRightOverlayView = NO;
            _isDraggingLeftOverlayView = NO;
            self.touchPointX = [gesture locationInView:bottomView].x;
            if (isRight){
                self.rightStartPoint = [gesture locationInView:bottomView];
                _isDraggingRightOverlayView = YES;
                _isDraggingLeftOverlayView = NO;
            }
            else if (isLeft){
                self.leftStartPoint = [gesture locationInView:bottomView];
                _isDraggingRightOverlayView = NO;
                _isDraggingLeftOverlayView = YES;
            }
        }
            break;
        case UIGestureRecognizerStateChanged:
        {
            CGPoint point = [gesture locationInView:bottomView];
            // Left handle
            if (_isDraggingLeftOverlayView){
                CGFloat deltaX = point.x - self.leftStartPoint.x;
                CGPoint center = leftDragView.center;
                center.x += deltaX;
                CGFloat durationTime = (SCREEN_WIDTH-100)*2/10; // minimum selection: 2 seconds
                BOOL flag = (self.endPointX - point.x) > durationTime;
                if (center.x >= (50 - SCREEN_WIDTH/2) && flag) {
                    leftDragView.center = center;
                    self.leftStartPoint = point;
                    self.startTime = (point.x + editScrollView.contentOffset.x) / self.IMG_Width;
                    topBorder.frame = CGRectMake(self.boderX += deltaX/2, 0, self.boderWidth -= deltaX/2, 2);
                    bottomBorder.frame = CGRectMake(self.boderX += deltaX/2, 50-2, self.boderWidth -= deltaX/2, 2);
                    self.startPointX = point.x;
                }
                CMTime startTime = CMTimeMakeWithSeconds((point.x + editScrollView.contentOffset.x) / self.IMG_Width, self.player.currentTime.timescale);
                // Sub-second seeking only works while the video is playing
                [self.player seekToTime:startTime toleranceBefore:kCMTimeZero toleranceAfter:kCMTimeZero];
            }
            else if (_isDraggingRightOverlayView){ // Right handle
                CGFloat deltaX = point.x - self.rightStartPoint.x;
                CGPoint center = rightDragView.center;
                center.x += deltaX;
                CGFloat durationTime = (SCREEN_WIDTH-100)*2/10; // minimum selection: 2 seconds
                BOOL flag = (point.x - self.startPointX) > durationTime;
                if (center.x <= (SCREEN_WIDTH - 50 + SCREEN_WIDTH/2) && flag) {
                    rightDragView.center = center;
                    self.rightStartPoint = point;
                    self.endTime = (point.x + editScrollView.contentOffset.x) / self.IMG_Width;
                    topBorder.frame = CGRectMake(self.boderX, 0, self.boderWidth += deltaX/2, 2);
                    bottomBorder.frame = CGRectMake(self.boderX, 50-2, self.boderWidth += deltaX/2, 2);
                    self.endPointX = point.x;
                }
                CMTime startTime = CMTimeMakeWithSeconds((point.x + editScrollView.contentOffset.x) / self.IMG_Width, self.player.currentTime.timescale);
                // Sub-second seeking only works while the video is playing
                [self.player seekToTime:startTime toleranceBefore:kCMTimeZero toleranceAfter:kCMTimeZero];
            }
            else { // Drag the scroll view itself
                CGFloat deltaX = point.x - self.touchPointX;
                CGFloat newOffset = editScrollView.contentOffset.x + deltaX;
                CGPoint currentOffSet = CGPointMake(newOffset, 0);
                if (currentOffSet.x >= 0 && currentOffSet.x <= (editScrollView.contentSize.width - SCREEN_WIDTH)) {
                    editScrollView.contentOffset = CGPointMake(newOffset, 0);
                    self.touchPointX = point.x;
                }
            }
        }
            break;
        case UIGestureRecognizerStateEnded:
        {
            [self startTimer];
        }
            break;
        default:
            break;
    }
}
One extra detail worth noting: sub-second seeking with AVPlayer only works during playback, via
- (void)seekToTime:(CMTime)time toleranceBefore:(CMTime)toleranceBefore toleranceAfter:(CMTime)toleranceAfter;
While paused, no matter what time value you pass, seeking snaps to one-second units. During development I did find an API that can step by less than a second while paused, on AVPlayerItem:
- (void)stepByCount:(NSInteger)stepCount;
but the official documentation is vague about the stepCount value. The documentation says the size of each step depends on the enabled AVPlayerItemTrack objects of the receiver (see tracks). I printed the tracks and found only one audio track and one video track, and nothing more useful, so I abandoned that route for now. Fortunately, sub-second seeking does work during playback; this may also be one reason WeChat Moments keeps its videos looping continuously.
- 5. On completion, extract the selected segment, save it to the album, and obtain its URL
The final trim is done with AVAssetExportSession: given the start/end times and the source file, it produces the trimmed video. The code is as follows.
#pragma mark Video trimming
- (void)notifyDelegateOfDidChange{
    self.tempVideoPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"tmpMov.mov"];
    [self deleteTempFile];
    AVAsset *asset = [AVAsset assetWithURL:self.videoUrl];
    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:asset
                                                                           presetName:AVAssetExportPresetPassthrough];
    NSURL *furl = [NSURL fileURLWithPath:self.tempVideoPath];
    exportSession.outputURL = furl;
    exportSession.outputFileType = AVFileTypeQuickTimeMovie;
    CMTime start = CMTimeMakeWithSeconds(self.startTime, self.player.currentTime.timescale);
    CMTime duration = CMTimeMakeWithSeconds(self.endTime - self.startTime, self.player.currentTime.timescale);
    exportSession.timeRange = CMTimeRangeMake(start, duration);
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        switch ([exportSession status]) {
            case AVAssetExportSessionStatusFailed:
                NSLog(@"Export failed: %@", [[exportSession error] localizedDescription]);
                break;
            case AVAssetExportSessionStatusCancelled:
                NSLog(@"Export canceled");
                break;
            default: {
                // Export completed: save to the album and reload the player
                NSURL *movieUrl = [NSURL fileURLWithPath:self.tempVideoPath];
                dispatch_async(dispatch_get_main_queue(), ^{
                    UISaveVideoAtPathToSavedPhotosAlbum([movieUrl relativePath], self, @selector(video:didFinishSavingWithError:contextInfo:), nil);
                    NSLog(@"Trimmed video path: %@", self.tempVideoPath);
                    self.isEdited = YES;
                    [self invalidatePlayer];
                    [self initPlayerWithVideoUrl:movieUrl];
                    bottomView.hidden = YES;
                });
                break;
            }
        }
    }];
}
- (void)video:(NSString *)videoPath didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo {
    if (error) {
        NSLog(@"Failed to save to the photo album");
    } else {
        NSLog(@"Saved to the photo album");
    }
}
- (void)deleteTempFile{
    NSURL *url = [NSURL fileURLWithPath:self.tempVideoPath];
    NSFileManager *fm = [NSFileManager defaultManager];
    BOOL exist = [fm fileExistsAtPath:url.path];
    NSError *err;
    if (exist) {
        [fm removeItemAtURL:url error:&err];
        if (err) {
            NSLog(@"file remove error, %@", err.localizedDescription);
        } else {
            NSLog(@"file deleted");
        }
    } else {
        NSLog(@"no file by that name");
    }
}
The finished result is shown below (a screen recording mirrored from the phone to a computer, so gestures are not visible; download the demo to try it yourself):
MARK: I have since written a Swift version; here are a few issues I ran into along the way.
- Swift is strict about numeric types in arithmetic: values of different numeric types must be converted explicitly. For example, calling
public func CMTimeMakeWithSeconds(_ seconds: Float64, _ preferredTimescale: Int32) -> CMTime
requires explicitly casting any non-Float64 argument:
let startTim = CMTimeMakeWithSeconds(Float64(second), player.currentTime().timescale)
- The syntax for calling throwing functions in Swift.
Without capturing the error:
do { try session.setActive(true) }
catch {}
Capturing the error:
do { try filem.removeItem(at: url) }
catch let err as NSError {
    error = err
}
Whatever problem you run into, reading the relevant official documentation will usually solve it. One gripe: Swift compilation and code completion really are slow; reportedly Xcode 9 will improve this.
OC demo link
Swift demo link (give it a Star on GitHub!)
If you like this post, give it a like!
Better ideas and suggestions for improvement are welcome; let's make progress together!