iOS Custom Camera: Parameter Adjustment, Video Speed Control, and Video Merging

海二少 — published 2017/10/10 15:58
The AVFoundation Framework

1. AVAsset: represents a multimedia resource — a video or audio file — and exposes its information. It is an abstract class and cannot be used directly.

2. AVURLAsset: a subclass of AVAsset that creates an asset, with all of its media information, from a URL.

NSURL *url = <#A URL that identifies an audiovisual asset such as a movie file#>;
AVURLAsset *anAsset = [[AVURLAsset alloc] initWithURL:url options:nil];

3. AVCaptureSession: captures video and audio, and coordinates the input and output streams.

AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];
if ([captureSession canSetSessionPreset:AVCaptureSessionPreset1280x720]) captureSession.sessionPreset = AVCaptureSessionPreset1280x720;

4. AVCaptureDevice: represents an input device, such as a camera or microphone.

AVCaptureDevice *device = [self getCameraDeviceWithPosition:AVCaptureDevicePositionBack];

- (AVCaptureDevice *)getCameraDeviceWithPosition:(AVCaptureDevicePosition)position {
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *device in devices) {
        // Keep scanning so a later device at the same position can still match
        if ([device position] == position &&
            [device supportsAVCaptureSessionPreset:AVCaptureSessionPreset1280x720]) {
            return device;
        }
    }
    return nil;
}

5. AVCaptureDeviceInput: a video or audio input stream. Add it to an AVCaptureSession, which then manages it.

NSError *error;
AVCaptureDeviceInput *input =
        [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (!input) {
    // Handle the error appropriately.
}
if ([captureSession canAddInput:input]) {
    [captureSession addInput:input];
}

6. AVCaptureOutput: a video or audio output stream. You normally use one of its subclasses — AVCaptureAudioDataOutput, AVCaptureVideoDataOutput, AVCaptureStillImageOutput, AVCaptureFileOutput, and so on — and add the object to an AVCaptureSession, which then manages it.

AVCaptureMovieFileOutput *movieOutput = [[AVCaptureMovieFileOutput alloc] init];
if ([captureSession canAddOutput:movieOutput]) {
    [captureSession addOutput:movieOutput];
}

7. AVCaptureVideoPreviewLayer: a preview layer for watching the camera feed in real time.

AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:captureSession];
captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
captureVideoPreviewLayer.frame = <#Set layer frame#>;

8. AVCaptureConnection: the connection between an AVCaptureSession and its input and output streams. It can be used to adjust settings such as video stabilization.

AVCaptureConnection *captureConnection = [movieOutput connectionWithMediaType:AVMediaTypeVideo];
// Enable cinematic video stabilization
captureConnection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeCinematic;

9. AVCaptureDeviceFormat: the input device's capabilities and settings; use it to adjust things such as ISO, slow motion, and stabilization.

AVCaptureDevice *captureDevice = [self.captureDeviceInput device];
NSError *error;
if ([captureDevice lockForConfiguration:&error]) {
    CGFloat minISO = captureDevice.activeFormat.minISO;
    CGFloat maxISO = captureDevice.activeFormat.maxISO;
    // Set the ISO to 70% of the supported range
    CGFloat currentISO = (maxISO - minISO) * 0.7 + minISO;
    [captureDevice setExposureModeCustomWithDuration:AVCaptureExposureDurationCurrent ISO:currentISO completionHandler:nil];
    [captureDevice unlockForConfiguration];
} else {
    // Handle the error appropriately.
}

Initializing the Camera

/// Coordinates data flow between the input and output devices
@property (nonatomic, strong) AVCaptureSession *captureSession;
/// Video input stream obtained from an AVCaptureDevice
@property (nonatomic, strong) AVCaptureDeviceInput *captureDeviceInput;
/// Audio input stream obtained from an AVCaptureDevice
@property (nonatomic, strong) AVCaptureDeviceInput *audioCaptureDeviceInput;
/// Movie file output stream
@property (nonatomic, strong) AVCaptureMovieFileOutput *captureMovieFileOutput;
/// Camera preview layer
@property (nonatomic, strong) AVCaptureVideoPreviewLayer *captureVideoPreviewLayer;

... Common methods

/// Get the camera device at the given position
- (AVCaptureDevice *)getCameraDeviceWithPosition:(AVCaptureDevicePosition)position {
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *device in devices) {
        // Keep scanning so a later device at the same position can still match
        if ([device position] == position &&
            [device supportsAVCaptureSessionPreset:AVCaptureSessionPreset1280x720]) {
            return device;
        }
    }
    return nil;
}

... Creating the custom camera

// Create the AVCaptureSession
_captureSession = [[AVCaptureSession alloc] init];
if ([_captureSession canSetSessionPreset:AVCaptureSessionPreset1280x720]) _captureSession.sessionPreset = AVCaptureSessionPreset1280x720;

// Get the camera device
AVCaptureDevice *videoCaptureDevice = [self getCameraDeviceWithPosition:AVCaptureDevicePositionBack];
if (!videoCaptureDevice)  {
    // Handle the error appropriately.
}

// Create the video input
NSError *error = nil;
_captureDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:videoCaptureDevice error:&error];
if (error) {
    // Handle the error appropriately.
}

// Get the microphone
AVCaptureDevice *audioCaptureDevice = [[AVCaptureDevice devicesWithMediaType:AVMediaTypeAudio] firstObject];

// Create the audio input
_audioCaptureDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:audioCaptureDevice error:&error];
if (error) {
    // Handle the error appropriately.
}

// Add the video and audio inputs to the AVCaptureSession
if ([_captureSession canAddInput:_captureDeviceInput] && [_captureSession canAddInput:_audioCaptureDeviceInput]) {
    [_captureSession addInput:_captureDeviceInput];
    [_captureSession addInput:_audioCaptureDeviceInput];
}

// Create the movie file output
_captureMovieFileOutput = [[AVCaptureMovieFileOutput alloc] init];

// Add the output to the AVCaptureSession
if ([_captureSession canAddOutput:_captureMovieFileOutput]) {
    [_captureSession addOutput:_captureMovieFileOutput];
    // Get the connection from the output
    AVCaptureConnection *captureConnection = [_captureMovieFileOutput connectionWithMediaType:AVMediaTypeVideo];
    // Check whether cinematic stabilization is supported
    if ([videoCaptureDevice.activeFormat isVideoStabilizationModeSupported:AVCaptureVideoStabilizationModeCinematic]) {
        // Enable stabilization if it is supported
        captureConnection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeCinematic;
    }
}

// Save the default AVCaptureDeviceFormat.
// After changing the capture frame rate for slow motion, stabilization can no longer be re-enabled;
// in testing, only this default format allows it. So we save it here and restore it when slow-motion
// capture is turned off, which lets stabilization be enabled again.
_defaultFormat = videoCaptureDevice.activeFormat;
_defaultMinFrameDuration = videoCaptureDevice.activeVideoMinFrameDuration;
_defaultMaxFrameDuration = videoCaptureDevice.activeVideoMaxFrameDuration;

// Create the preview layer
_captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:_captureSession];
_captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill; // fill mode
_captureVideoPreviewLayer.frame = self.bounds;

// The preview layer is a CALayer, so you can create a UIView and add the layer as a sublayer of that view's layer.
// Since this code lives in the view's init method, we call addSublayer on self.layer directly.
[self.layer addSublayer:_captureVideoPreviewLayer];

// Start capturing
[self.captureSession startRunning];

Building the Control UI

You can add the views you need directly as subviews of the view that hosts the preview layer. My approach is to create a UIView the same size as the preview layer to act as a control panel, with a transparent background, and lay it over the preview layer. All gesture handlers, button taps, and so on are handled by the control panel, which forwards each action to the camera view through a delegate so the camera can respond. The forwarding code is straightforward, so it is omitted here.
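As a minimal sketch of that delegate wiring (the protocol and view names are illustrative — only the method names match the handlers implemented below):

```objc
// Hypothetical control-panel view; the camera view implements the delegate.
@protocol CameraBackgroundViewDelegate <NSObject>
- (void)cameraBackgroundDidClickChangeBack;
- (void)cameraBackgroundDidClickChangeFront;
- (void)cameraBackgroundDidChangeZoom:(CGFloat)zoom;
- (void)cameraBackgroundDidTap:(CGPoint)point;
@end

@interface CameraBackgroundView : UIView
// weak, to avoid a retain cycle between the panel and the camera view
@property (nonatomic, weak) id<CameraBackgroundViewDelegate> delegate;
@end

// Inside the control panel, a tap is simply forwarded, e.g.:
// [self.delegate cameraBackgroundDidTap:[gesture locationInView:self]];
```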

Camera Settings

1. Switch to the back camera

#pragma mark - Switch to the back camera
- (void)cameraBackgroundDidClickChangeBack {
    AVCaptureDevice *toChangeDevice;
    AVCaptureDevicePosition toChangePosition = AVCaptureDevicePositionBack;
    toChangeDevice = [self getCameraDeviceWithPosition:toChangePosition];
    AVCaptureDeviceInput *toChangeDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:toChangeDevice error:nil];
    [self.captureSession beginConfiguration];
    [self.captureSession removeInput:self.captureDeviceInput];
    if ([self.captureSession canAddInput:toChangeDeviceInput]) {
        [self.captureSession addInput:toChangeDeviceInput];
        self.captureDeviceInput = toChangeDeviceInput;
    }
    [self.captureSession commitConfiguration];
}

2. Switch to the front camera

- (void)cameraBackgroundDidClickChangeFront {
    AVCaptureDevice *toChangeDevice;
    AVCaptureDevicePosition toChangePosition = AVCaptureDevicePositionFront;
    toChangeDevice = [self getCameraDeviceWithPosition:toChangePosition];
    AVCaptureDeviceInput *toChangeDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:toChangeDevice error:nil];
    [self.captureSession beginConfiguration];
    [self.captureSession removeInput:self.captureDeviceInput];
    if ([self.captureSession canAddInput:toChangeDeviceInput]) {
        [self.captureSession addInput:toChangeDeviceInput];
        self.captureDeviceInput = toChangeDeviceInput;
    }
    [self.captureSession commitConfiguration];
}

3. Turn on the torch

- (void)cameraBackgroundDidClickOpenFlash {
    AVCaptureDevice *captureDevice = [self.captureDeviceInput device];
    NSError *error;
    if ([captureDevice lockForConfiguration:&error]) {
        if ([captureDevice isTorchModeSupported:AVCaptureTorchModeOn]) [captureDevice setTorchMode:AVCaptureTorchModeOn];
        [captureDevice unlockForConfiguration];
    } else {
        // Handle the error appropriately.
    }
}

4. Turn off the torch

- (void)cameraBackgroundDidClickCloseFlash {
    AVCaptureDevice *captureDevice = [self.captureDeviceInput device];
    NSError *error;
    if ([captureDevice lockForConfiguration:&error]) {
        if ([captureDevice isTorchModeSupported:AVCaptureTorchModeOff]) [captureDevice setTorchMode:AVCaptureTorchModeOff];
        [captureDevice unlockForConfiguration];
    } else {
        // Handle the error appropriately.
    }
}

5. Adjusting the focus

// The lens position ranges from 0.0 to 1.0
- (void)cameraBackgroundDidChangeFocus:(CGFloat)focus {
    AVCaptureDevice *captureDevice = [self.captureDeviceInput device];
    NSError *error;
    if ([captureDevice lockForConfiguration:&error]) {
        // Check for AVCaptureFocusModeLocked: that is the mode setFocusModeLockedWithLensPosition: puts the device in
        if ([captureDevice isFocusModeSupported:AVCaptureFocusModeLocked]) [captureDevice setFocusModeLockedWithLensPosition:focus completionHandler:nil];
        [captureDevice unlockForConfiguration];
    } else {
        // Handle the error appropriately.
    }
}

6. Digital zoom

// Digital zoom, 1x to 3x
- (void)cameraBackgroundDidChangeZoom:(CGFloat)zoom {
    AVCaptureDevice *captureDevice = [self.captureDeviceInput device];
    NSError *error;
    if ([captureDevice lockForConfiguration:&error]) {
        [captureDevice rampToVideoZoomFactor:zoom withRate:50];
        [captureDevice unlockForConfiguration];
    } else {
        // Handle the error appropriately.
    }
}

7. Adjusting the ISO (light sensitivity)

// Adjust the ISO; iso spans 0.0-1.0 of the supported range
- (void)cameraBackgroundDidChangeISO:(CGFloat)iso {
    AVCaptureDevice *captureDevice = [self.captureDeviceInput device];
    NSError *error;
    if ([captureDevice lockForConfiguration:&error]) {
        CGFloat minISO = captureDevice.activeFormat.minISO;
        CGFloat maxISO = captureDevice.activeFormat.maxISO;
        CGFloat currentISO = (maxISO - minISO) * iso + minISO;
        [captureDevice setExposureModeCustomWithDuration:AVCaptureExposureDurationCurrent ISO:currentISO completionHandler:nil];
        [captureDevice unlockForConfiguration];
    }else{
        // Handle the error appropriately.
    }
}

8. Tap to focus

// point is the tapped location on screen
- (void)cameraBackgroundDidTap:(CGPoint)point {
    AVCaptureDevice *captureDevice = [self.captureDeviceInput device];
    CGPoint location = point;
    CGSize frameSize = self.captureVideoPreviewLayer.frame.size;
    // The front camera preview is mirrored horizontally
    if ([captureDevice position] == AVCaptureDevicePositionFront) location.x = frameSize.width - location.x;
    // Convert from layer coordinates to the device's point-of-interest space,
    // which runs from (0,0) to (1,1) in the sensor's landscape orientation.
    // No lock is taken here; focusWithMode:... below takes its own configuration lock.
    CGPoint pointOfInterest = CGPointMake(location.y / frameSize.height, 1.f - (location.x / frameSize.width));
    [self focusWithMode:AVCaptureFocusModeAutoFocus exposureMode:AVCaptureExposureModeAutoExpose atPoint:pointOfInterest];

    // Note: this registers a new KVO observer on every tap; pair it with removeObserver: or move it to one-time setup
    [[self.captureDeviceInput device] addObserver:self forKeyPath:@"ISO" options:NSKeyValueObservingOptionNew context:NULL];
}

- (void)focusWithMode:(AVCaptureFocusMode)focusMode exposureMode:(AVCaptureExposureMode)exposureMode atPoint:(CGPoint)point {
    AVCaptureDevice *captureDevice = [self.captureDeviceInput device];
    NSError *error;
    if ([captureDevice lockForConfiguration:&error]) {
        // Set the points of interest before the modes; setting the mode triggers the refocus/re-exposure
        if ([captureDevice isFocusPointOfInterestSupported]) [captureDevice setFocusPointOfInterest:point];
        if ([captureDevice isFocusModeSupported:focusMode]) [captureDevice setFocusMode:focusMode];
        if ([captureDevice isExposurePointOfInterestSupported]) [captureDevice setExposurePointOfInterest:point];
        if ([captureDevice isExposureModeSupported:exposureMode]) [captureDevice setExposureMode:exposureMode];
        [captureDevice unlockForConfiguration];
    } else {
        // Handle the error appropriately.
    }
}

9. Determining the video orientation while recording
Because the camera is a special case — the user may have locked screen rotation — you cannot rely on the view controller's interface orientation. Instead, use Core Motion's gravity readings to work out how the phone is being held.

@property (nonatomic, strong) CMMotionManager *motionManager;
@property (nonatomic, assign) UIDeviceOrientation deviceOrientation;
...

_motionManager = [[CMMotionManager alloc] init];
_motionManager.deviceMotionUpdateInterval = 1/15.0;
if (_motionManager.deviceMotionAvailable) {
    [_motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue currentQueue] withHandler:^(CMDeviceMotion * _Nullable motion, NSError * _Nullable error) {
        [self performSelectorOnMainThread:@selector(handleDeviceMotion:) withObject:motion waitUntilDone:YES];
    }];
} else {
    NSLog(@"No device motion on device");
}

...

/// Gravity (device motion) callback
- (void)handleDeviceMotion:(CMDeviceMotion *)deviceMotion{
    double x = deviceMotion.gravity.x;
    double y = deviceMotion.gravity.y;

    CGAffineTransform videoTransform;

    if (fabs(y) >= fabs(x)) {
        if (y >= 0) {
            videoTransform = CGAffineTransformMakeRotation(M_PI);
            _deviceOrientation = UIDeviceOrientationPortraitUpsideDown;
        } else {
            videoTransform = CGAffineTransformMakeRotation(0);
            _deviceOrientation = UIDeviceOrientationPortrait;
        }
    } else {
        if (x >= 0) {
            videoTransform = CGAffineTransformMakeRotation(-M_PI_2);
            _deviceOrientation = UIDeviceOrientationLandscapeRight;    // landscape, Home button on the left
        } else {
            videoTransform = CGAffineTransformMakeRotation(M_PI_2);
            _deviceOrientation = UIDeviceOrientationLandscapeLeft;     // landscape, Home button on the right
        }
    }
    // Tell the control panel the current orientation so its buttons can rotate to match
    [self.backgroundView setOrientation:_deviceOrientation];
}

11. Enabling slow-motion capture

- (void)cameraBackgroundDidClickOpenSlow {
    [self.captureSession stopRunning];
    CGFloat desiredFPS = 240.0;
    AVCaptureDevice *videoDevice = self.captureDeviceInput.device;
    AVCaptureDeviceFormat *selectedFormat = nil;
    int32_t maxWidth = 0;
    AVFrameRateRange *frameRateRange = nil;
    for (AVCaptureDeviceFormat *format in [videoDevice formats]) {
        for (AVFrameRateRange *range in format.videoSupportedFrameRateRanges) {
            CMFormatDescriptionRef desc = format.formatDescription;
            CMVideoDimensions dimensions = CMVideoFormatDescriptionGetDimensions(desc);
            int32_t width = dimensions.width;
            if (range.minFrameRate <= desiredFPS && desiredFPS <= range.maxFrameRate && width >= maxWidth) {
                selectedFormat = format;
                frameRateRange = range;
                maxWidth = width;
            }
        }
    }
    if (selectedFormat) {
        if ([videoDevice lockForConfiguration:nil]) {
            NSLog(@"selected format: %@", selectedFormat);
            videoDevice.activeFormat = selectedFormat;
            videoDevice.activeVideoMinFrameDuration = CMTimeMake(1, (int32_t)desiredFPS);
            videoDevice.activeVideoMaxFrameDuration = CMTimeMake(1, (int32_t)desiredFPS);
            [videoDevice unlockForConfiguration];
        }
    }
    [self.captureSession startRunning];
}

12. Disabling slow-motion capture

- (void)cameraBackgroundDidClickCloseSlow {
    [self.captureSession stopRunning];
    AVCaptureDevice *videoDevice = self.captureDeviceInput.device;
    // Restore the saved default format and frame durations; no format search is
    // needed here, since the defaults were captured when the camera was set up.
    if ([videoDevice lockForConfiguration:nil]) {
        videoDevice.activeFormat = _defaultFormat;
        videoDevice.activeVideoMinFrameDuration = _defaultMinFrameDuration;
        videoDevice.activeVideoMaxFrameDuration = _defaultMaxFrameDuration;
        [videoDevice unlockForConfiguration];
    }
    [self.captureSession startRunning];
}

13. Enabling video stabilization

- (void)cameraBackgroundDidClickOpenAntiShake {
    AVCaptureConnection *captureConnection = [_captureMovieFileOutput connectionWithMediaType:AVMediaTypeVideo];
    NSLog(@"change captureConnection: %@", captureConnection);
    AVCaptureDevice *videoDevice = self.captureDeviceInput.device;
    NSLog(@"set format: %@", videoDevice.activeFormat);
    if ([videoDevice.activeFormat isVideoStabilizationModeSupported:AVCaptureVideoStabilizationModeCinematic]) {
        captureConnection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeCinematic;
    }
}

14. Disabling video stabilization

#pragma mark - Stabilization off
- (void)cameraBackgroundDidClickCloseAntiShake {
    AVCaptureConnection *captureConnection = [_captureMovieFileOutput connectionWithMediaType:AVMediaTypeVideo];
    NSLog(@"change captureConnection: %@", captureConnection);
    AVCaptureDevice *videoDevice = self.captureDeviceInput.device;
    if ([videoDevice.activeFormat isVideoStabilizationModeSupported:AVCaptureVideoStabilizationModeOff]) {
        captureConnection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeOff;
    }
}

15. Recording video

#pragma mark - Recording
- (void)cameraBackgroundDidClickPlay {
    // Get the connection from the output
    AVCaptureConnection *captureConnection = [self.captureMovieFileOutput connectionWithMediaType:AVMediaTypeVideo];
    // Only start when the output is not already recording
    if (![self.captureMovieFileOutput isRecording]) {
        captureConnection.videoOrientation = (AVCaptureVideoOrientation)_deviceOrientation; // match the video orientation to the device orientation
        NSString *outputFilePath = [kCachePath stringByAppendingPathComponent:[self movieName]];
        NSURL *fileURL = [NSURL fileURLWithPath:outputFilePath];
        [self.captureMovieFileOutput startRecordingToOutputFileURL:fileURL recordingDelegate:self];
        _currentMoviePath = outputFilePath;
    }
}

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didStartRecordingToOutputFileAtURL:(NSURL *)fileURL fromConnections:(NSArray *)connections {
    NSLog(@"recording started");
}

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error {
    NSLog(@"recording finished");
}

16. Stopping the recording

[self.captureMovieFileOutput stopRecording];

17. Adjusting the video speed
For slow motion you raise the camera's capture frame rate; for fast motion you simply speed up the video.
A clip shot in slow motion still plays back over its real capture time, so here we stretch its duration according to the configured slow-motion factor.

/// Re-time the video to the configured speed
- (void)setSpeedWithVideo:(NSDictionary *)video completed:(void(^)())completed {
    dispatch_async(dispatch_get_global_queue(0, 0), ^{
        NSLog(@"video set thread: %@", [NSThread currentThread]);
        // Load the recorded video
        AVURLAsset *videoAsset = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:video[kMoviePath]] options:nil];
        // Composition that will hold the re-timed tracks
        AVMutableComposition *mixComposition = [AVMutableComposition composition];
        // Video track
        AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
        // Audio track
        AVMutableCompositionTrack *compositionAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];

        // Orientation of the source video
        CGAffineTransform videoTransform = [videoAsset tracksWithMediaType:AVMediaTypeVideo].lastObject.preferredTransform;
        if (videoTransform.a == 0 && videoTransform.b == 1.0 && videoTransform.c == -1.0 && videoTransform.d == 0) {
            NSLog(@"shot in portrait");
            videoTransform = CGAffineTransformMakeRotation(M_PI_2);
        } else if (videoTransform.a == 0 && videoTransform.b == -1.0 && videoTransform.c == 1.0 && videoTransform.d == 0) {
            NSLog(@"shot upside down");
            videoTransform = CGAffineTransformMakeRotation(-M_PI_2);
        } else if (videoTransform.a == 1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == 1.0) {
            NSLog(@"shot in landscape, Home button on the right");
            videoTransform = CGAffineTransformMakeRotation(0);
        } else if (videoTransform.a == -1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == -1.0) {
            NSLog(@"shot in landscape, Home button on the left");
            videoTransform = CGAffineTransformMakeRotation(M_PI);
        }
        // Apply the source orientation to the composition's video track
        compositionVideoTrack.preferredTransform = videoTransform;
        compositionVideoTrack.naturalTimeScale = 600;

        // Insert the video track
        [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] firstObject] atTime:kCMTimeZero error:nil];
        // Insert the audio track
        [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeAudio] firstObject] atTime:kCMTimeZero error:nil];

        // Pick the speed factor
        CGFloat scale = 1.0;
        if ([video[kMovieSpeed] isEqualToString:kMovieSpeed_Fast]) {
            scale = 0.2f;  // fast, 5x
        } else if ([video[kMovieSpeed] isEqualToString:kMovieSpeed_Slow]) {
            scale = 4.0f;  // slow, 4x
        }

        // Scale the video and audio tracks by the speed factor
        [compositionVideoTrack scaleTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) toDuration:CMTimeMake(videoAsset.duration.value * scale, videoAsset.duration.timescale)];
        [compositionAudioTrack scaleTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) toDuration:CMTimeMake(videoAsset.duration.value * scale, videoAsset.duration.timescale)];

        // Configure the export
        AVAssetExportSession *assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPreset1280x720];
        // Temporary path for the exported video
        NSString *exportPath = [kCachePath stringByAppendingPathComponent:[self movieName]];
        NSURL *exportUrl = [NSURL fileURLWithPath:exportPath];

        // Export as a QuickTime .mov file
        assetExport.outputFileType = AVFileTypeQuickTimeMovie;
        assetExport.outputURL = exportUrl;
        assetExport.shouldOptimizeForNetworkUse = YES;

        // Run the export
        [assetExport exportAsynchronouslyWithCompletionHandler:^(void) {
            dispatch_async(dispatch_get_main_queue(), ^{
                [_processedVideoPaths addObject:exportPath];
                // Save the exported video to the photo library (note: use a file URL, not URLWithString:)
                ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
                if (![library videoAtPathIsCompatibleWithSavedPhotosAlbum:exportUrl]) {
                    NSLog(@"cache can't write");
                    completed();
                    return;
                }
                [library writeVideoAtPathToSavedPhotosAlbum:exportUrl completionBlock:^(NSURL *assetURL, NSError *error) {
                    if (error) {
                        completed();
                        NSLog(@"cache write error");
                    } else {
                        completed();
                        NSLog(@"cache write success");
                    }
                }];
            });
        }];
    });
}

18. Merging multiple videos into one

- (void)mergeVideosWithPaths:(NSArray *)paths completed:(void(^)(NSString *videoPath))completed {
    if (!paths.count) return;

    dispatch_async(dispatch_get_main_queue(), ^{
        AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];
        AVMutableCompositionTrack *audioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
        AVMutableCompositionTrack *videoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
        videoTrack.preferredTransform = CGAffineTransformRotate(CGAffineTransformIdentity, M_PI_2);

        CMTime totalDuration = kCMTimeZero;

//        NSMutableArray<AVMutableVideoCompositionLayerInstruction *> *instructions = [NSMutableArray array];

        for (int i = 0; i < paths.count; i++) {
            AVURLAsset *asset = [AVURLAsset assetWithURL:[NSURL fileURLWithPath:paths[i]]];

            AVAssetTrack *assetAudioTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] firstObject];
            AVAssetTrack *assetVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];

            NSLog(@"%lld", asset.duration.value / asset.duration.timescale);

            NSError *audioError = nil;
            BOOL audioOK = [audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:assetAudioTrack atTime:totalDuration error:&audioError];
            NSLog(@"audioError: %@ -- %d", audioError, audioOK);

            NSError *videoError = nil;
            BOOL videoOK = [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:assetVideoTrack atTime:totalDuration error:&videoError];
            NSLog(@"videoError: %@ -- %d", videoError, videoOK);

//            AVMutableVideoCompositionLayerInstruction *instruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
//            UIImageOrientation assetOrientation = UIImageOrientationUp;
//            BOOL isAssetPortrait = NO;
//            // Adjust the orientation based on how the clip was actually shot
//            CGAffineTransform videoTransform = assetVideoTrack.preferredTransform;
//            if (videoTransform.a == 0 && videoTransform.b == 1.0 && videoTransform.c == -1.0 && videoTransform.d == 0) {
//                NSLog(@"shot in portrait");
//                assetOrientation = UIImageOrientationRight;
//                isAssetPortrait = YES;
//            } else if (videoTransform.a == 0 && videoTransform.b == -1.0 && videoTransform.c == 1.0 && videoTransform.d == 0) {
//                NSLog(@"shot upside down");
//                assetOrientation = UIImageOrientationLeft;
//                isAssetPortrait = YES;
//            } else if (videoTransform.a == 1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == 1.0) {
//                NSLog(@"shot in landscape, Home button on the right");
//                assetOrientation = UIImageOrientationUp;
//            } else if (videoTransform.a == -1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == -1.0) {
//                NSLog(@"shot in landscape, Home button on the left");
//                assetOrientation = UIImageOrientationDown;
//            }
//            CGFloat assetScaleToFitRatio = 720.0 / assetVideoTrack.naturalSize.width;
//            if (isAssetPortrait) {
//                assetScaleToFitRatio = 720.0 / assetVideoTrack.naturalSize.height;
//                CGAffineTransform assetScaleFactor = CGAffineTransformMakeScale(assetScaleToFitRatio, assetScaleToFitRatio);
//                [instruction setTransform:CGAffineTransformConcat(assetVideoTrack.preferredTransform, assetScaleFactor) atTime:totalDuration];
//            } else {
//                /**
//                 Portrait video size: 720x1280
//                 Landscape video size: 720x405
//                 y offset needed to center a landscape video: (1280 - 405) / 2 = 437.5
//                 **/
//                CGAffineTransform assetScaleFactor = CGAffineTransformMakeScale(assetScaleToFitRatio, assetScaleToFitRatio);
//                [instruction setTransform:CGAffineTransformConcat(CGAffineTransformConcat(assetVideoTrack.preferredTransform, assetScaleFactor), CGAffineTransformMakeTranslation(0, 437.5)) atTime:totalDuration];
//            }
//            // Insert the newest instruction at the front; playback follows the array order.
//            [instructions insertObject:instruction atIndex:0];
            // Advance the insertion point so the next clip is appended rather than inserted at time zero
            totalDuration = CMTimeAdd(totalDuration, asset.duration);
//            // After this clip ends, clear the crop rectangle; otherwise, if the next clip is smaller, this one shows behind it.
//            [instruction setCropRectangle:CGRectZero atTime:totalDuration];
        }

//        AVMutableVideoCompositionInstruction *mixInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
//        mixInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, totalDuration);
//        mixInstruction.layerInstructions = instructions;

//        AVMutableVideoComposition *mixVideoComposition = [AVMutableVideoComposition videoComposition];
//        mixVideoComposition.instructions = [NSArray arrayWithObject:mixInstruction];
//        mixVideoComposition.frameDuration = CMTimeMake(1, 25);
//        mixVideoComposition.renderSize = CGSizeMake(720.0, 1280.0);

        NSString *outPath = [kVideoPath stringByAppendingPathComponent:[self movieName]];
        NSURL *mergeFileURL = [NSURL fileURLWithPath:outPath];

        AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];
        exporter.outputURL = mergeFileURL;
        exporter.outputFileType = AVFileTypeQuickTimeMovie;
//        exporter.videoComposition = mixVideoComposition;
        exporter.shouldOptimizeForNetworkUse = YES;
        [exporter exportAsynchronouslyWithCompletionHandler:^{
            dispatch_async(dispatch_get_main_queue(), ^{
                completed(outPath);
            });
        }];
    });
}
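A call site for the merge might look like the following sketch (the clip-path array and the body of the completion block are illustrative, not from the original code):

```objc
// Hypothetical list of previously recorded clip paths, in playback order
NSArray *clipPaths = @[_currentMoviePath]; // add more recorded paths here
[self mergeVideosWithPaths:clipPaths completed:^(NSString *videoPath) {
    // Runs on the main queue once the export finishes
    NSLog(@"merged video written to %@", videoPath);
}];
```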

 

Reposted from: http://www.jianshu.com/p/06ed571fb3b5
