AVFoundation - Play and record video (along with audio and a preview) simultaneously

I am trying to record and play back video at the same time. Is this possible with AVFoundation? Currently I can do it as long as I do not record audio. As soon as I add an audio input to the AVCaptureSession and restart everything, I receive "AVCaptureSessionWasInterruptedNotification" and the recording stops.
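To diagnose why the session is being interrupted, it can help to observe the notification and log its payload. A minimal sketch (the selector name `sessionWasInterrupted:` is my own, not from the sample code):

```objc
// Register for the interruption notification before starting the session.
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(sessionWasInterrupted:)
                                             name:AVCaptureSessionWasInterruptedNotification
                                           object:[captureManager session]];

// Hypothetical handler: logs the notification so the cause of the
// interruption (e.g. an audio session conflict with playback) is visible.
- (void)sessionWasInterrupted:(NSNotification *)notification
{
    NSLog(@"Capture session interrupted: %@", notification);
}
```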

This is how I play the video:

// Create a movie player for the recorded file and give it its own audio session.
MPMoviePlayerController *moviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:[NSURL fileURLWithPath:path]];
[moviePlayer.view setFrame:self.playerView.bounds];
moviePlayer.useApplicationAudioSession = NO;
self.player = moviePlayer; // retained property keeps the player alive

[moviePlayer release];

[self.playerView addSubview:player.view];

[player play];

And this is how I record video:

NSError *error;

// AVCamCaptureManager comes from Apple's AVCam sample code.
AVCamCaptureManager *captureManager = [[AVCamCaptureManager alloc] init];

if ([captureManager setupSessionWithPreset:AVCaptureSessionPresetLow error:&error])
{
    [self setCaptureManager:captureManager];

    AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:[captureManager session]];
    self.captureVideoPreviewLayer = previewLayer;
    [previewLayer release]; // the retained property now owns the layer

    UIView *view = [self cameraView];
    CALayer *viewLayer = [view layer];
    [viewLayer setMasksToBounds:YES];

    CGRect bounds = [view bounds];
    [captureVideoPreviewLayer setFrame:bounds];

    if ([captureVideoPreviewLayer isOrientationSupported]) {
        [captureVideoPreviewLayer setOrientation:AVCaptureVideoOrientationPortrait];
    }

    [captureVideoPreviewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];

    [[captureManager session] startRunning];

    if ([[captureManager session] isRunning])
    {
        [captureManager setOrientation:AVCaptureVideoOrientationPortrait];
        [captureManager setDelegate:self];


        [viewLayer insertSublayer:captureVideoPreviewLayer below:[[viewLayer sublayers] objectAtIndex:0]];

        NSLog(@"Device count: %lu", (unsigned long)[[AVCaptureDevice devices] count]);


    } else {
        UIAlertView *alertView = [[UIAlertView alloc] initWithTitle:@"Failure"
                                                            message:@"Failed to start session."
                                                           delegate:nil
                                                  cancelButtonTitle:@"Okay"
                                                  otherButtonTitles:nil];
        [alertView show];
        [alertView release];

    }
} else {
    UIAlertView *alertView = [[UIAlertView alloc] initWithTitle:@"Input Device Init Failed"
                                                        message:[error localizedDescription]
                                                       delegate:nil
                                              cancelButtonTitle:@"Okay"
                                              otherButtonTitles:nil];
    [alertView show];
    [alertView release];        
}

[captureManager release]; // balances the alloc; the retained property keeps the manager alive

if (![[self captureManager] isRecording]) {
    [[self captureManager] startRecording];
}

Here I am using the "AVCamCaptureManager" from Apple's AVCam sample code.
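For reference, the audio input that triggers the interruption is added to the session roughly like this (a sketch of what AVCamCaptureManager does internally; `session` stands for the manager's capture session):

```objc
// Add the default microphone as an audio input to an existing AVCaptureSession.
NSError *audioError = nil;
AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice
                                                                         error:&audioError];
if (audioInput && [session canAddInput:audioInput]) {
    [session addInput:audioInput];
}
```

It is this audio input that appears to conflict with the movie player's audio session and produce the interruption described above.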
