Apple Resources:
Media Capture

The following are typically required:

AVCaptureDevice – represents the input device (camera or microphone)
AVCaptureInput – a concrete subclass of this is used to configure the ports from the input device (an input has one or more input ports, which are instances of AVCaptureInputPort)
AVCaptureOutput – a concrete subclass of this is used to manage the output to a movie file or still image (an output accepts data from one or more sources, e.g. an AVCaptureMovieFileOutput object accepts both video and audio data)
AVCaptureSession – coordinates the data flow from the input to the output
AVCaptureVideoPreviewLayer – shows the user what a camera is recording
AVCaptureConnection – a connection between a capture input and a capture output in a capture session. It can be used to enable or disable the flow of data from a given input or to a given output, and to monitor the average and peak power levels in audio channels.
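As a bare-bones sketch of how these classes fit together (error handling and memory management omitted; this is not the full implementation given below):

```objc
//Session coordinates the flow from input to output
AVCaptureSession *session = [[AVCaptureSession alloc] init];

//Input: the default camera, wrapped in a device input
AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:nil];
if ([session canAddInput:input])
	[session addInput:input];

//Output: a movie file output (accepts both video and audio data)
AVCaptureMovieFileOutput *output = [[AVCaptureMovieFileOutput alloc] init];
if ([session canAddOutput:output])
	[session addOutput:output];

//Preview layer shows the user what the camera is recording
AVCaptureVideoPreviewLayer *preview = [AVCaptureVideoPreviewLayer layerWithSession:session];

[session startRunning];
```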


A capture session posts notifications that you can observe to be notified, for example, when it starts or stops running, or when it is interrupted. You can also register to receive an AVCaptureSessionRuntimeErrorNotification if a runtime error occurs. You can also interrogate the session’s running property to find out if it is running, and its interrupted property to find out if it is interrupted.
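For example, registering for the runtime-error notification might look like this (the handler method name here is our own choice):

```objc
//Register for runtime errors on the capture session (e.g. while setting it up)
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(captureSessionRuntimeError:)
                                             name:AVCaptureSessionRuntimeErrorNotification
                                           object:CaptureSession];

//Our handler pulls the NSError out of the notification's user info
- (void)captureSessionRuntimeError:(NSNotification *)notification
{
	NSError *error = [[notification userInfo] objectForKey:AVCaptureSessionErrorKey];
	NSLog(@"Capture session runtime error: %@", error);
}
```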

A Working Implementation

It's surprisingly hard to find good working examples of recording video with AVCaptureSession. This is the result of us coming up with a nice and simple implementation which works and can be expanded on as required. All the basics are there to add clever things to get at the audio and video data if desired.
This example may be used with a standard ViewController class with .xib.
It shows a preview of the camera in landscape orientation. Two buttons can be added to your xib for start/stop recording and toggle camera.

In your ViewController.h

#import <UIKit/UIKit.h>

#import <Foundation/Foundation.h>
#import <CoreMedia/CoreMedia.h>
#import <AVFoundation/AVFoundation.h>

#import <AssetsLibrary/AssetsLibrary.h>		//<<Can delete if not storing videos to the photo library (and remove the AssetsLibrary framework from the project, which this requires)


#define CAPTURE_FRAMES_PER_SECOND	20		//Used below when setting the video connection frame rate

@interface MyViewController_iPhone : UIViewController <AVCaptureFileOutputRecordingDelegate>
{
	BOOL WeAreRecording;
	AVCaptureSession *CaptureSession;
	AVCaptureMovieFileOutput *MovieFileOutput;
	AVCaptureDeviceInput *VideoInputDevice;
}

@property (retain) AVCaptureVideoPreviewLayer *PreviewLayer;

- (void) CameraSetOutputProperties;
- (AVCaptureDevice *) CameraWithPosition:(AVCaptureDevicePosition) Position;
- (IBAction)StartStopButtonPressed:(id)sender;
- (IBAction)CameraToggleButtonPressed:(id)sender;

@end

In your ViewController.m

#import "MyViewController_iPhone.h"

@implementation MyViewController_iPhone

@synthesize PreviewLayer;

- (id)initWithNibName:(NSString *)nibNameOrNil bundle:(NSBundle *)nibBundleOrNil
{
    self = [super initWithNibName:nibNameOrNil bundle:nibBundleOrNil];
    if (self) {
        // Custom initialization
    }
    return self;
}

//********** VIEW DID LOAD **********
- (void)viewDidLoad
{
	[super viewDidLoad];

	NSLog(@"Setting up capture session");
	CaptureSession = [[AVCaptureSession alloc] init];

	//----- ADD INPUTS -----
	NSLog(@"Adding video input");

	//Add video input
	AVCaptureDevice *VideoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
	if (VideoDevice)
	{
		NSError *error;
		VideoInputDevice = [[AVCaptureDeviceInput deviceInputWithDevice:VideoDevice error:&error] retain];
		if (!error)
		{
			if ([CaptureSession canAddInput:VideoInputDevice])
				[CaptureSession addInput:VideoInputDevice];
			else
				NSLog(@"Couldn't add video input");
		}
		else
		{
			NSLog(@"Couldn't create video input");
		}
	}
	else
	{
		NSLog(@"Couldn't create video capture device");
	}

	//Add audio input
	NSLog(@"Adding audio input");
	AVCaptureDevice *audioCaptureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
	NSError *error = nil;
	AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioCaptureDevice error:&error];
	if (audioInput)
	{
		[CaptureSession addInput:audioInput];
	}

	//----- ADD OUTPUTS -----

	//Add video preview layer
	NSLog(@"Adding video preview layer");
	[self setPreviewLayer:[[[AVCaptureVideoPreviewLayer alloc] initWithSession:CaptureSession] autorelease]];

	PreviewLayer.orientation = AVCaptureVideoOrientationLandscapeRight;		//<<SET ORIENTATION.  You can deliberately set this wrong to flip the image and may actually need to set it wrong to get the right image

	[[self PreviewLayer] setVideoGravity:AVLayerVideoGravityResizeAspectFill];

	//Add movie file output
	NSLog(@"Adding movie file output");
	MovieFileOutput = [[AVCaptureMovieFileOutput alloc] init];

	Float64 TotalSeconds = 60;			//Total seconds
	int32_t preferredTimeScale = 30;	//Frames per second
	CMTime maxDuration = CMTimeMakeWithSeconds(TotalSeconds, preferredTimeScale);	//<<SET MAX DURATION
	MovieFileOutput.maxRecordedDuration = maxDuration;

	MovieFileOutput.minFreeDiskSpaceLimit = 1024 * 1024;						//<<SET MIN FREE SPACE IN BYTES FOR RECORDING TO CONTINUE ON A VOLUME

	if ([CaptureSession canAddOutput:MovieFileOutput])
		[CaptureSession addOutput:MovieFileOutput];

	//SET THE CONNECTION PROPERTIES (output properties)
	[self CameraSetOutputProperties];			//(We call a method as it also has to be done after changing camera)

	//----- SET THE IMAGE QUALITY / RESOLUTION -----
	//Options:
	//	AVCaptureSessionPresetHigh - Highest recording quality (varies per device)
	//	AVCaptureSessionPresetMedium - Suitable for WiFi sharing (actual values may change)
	//	AVCaptureSessionPresetLow - Suitable for 3G sharing (actual values may change)
	//	AVCaptureSessionPreset640x480 - 640x480 VGA (check it's supported before setting it)
	//	AVCaptureSessionPreset1280x720 - 1280x720 720p HD (check it's supported before setting it)
	//	AVCaptureSessionPresetPhoto - Full photo resolution (not supported for video output)
	NSLog(@"Setting image quality");
	[CaptureSession setSessionPreset:AVCaptureSessionPresetMedium];
	if ([CaptureSession canSetSessionPreset:AVCaptureSessionPreset640x480])		//Check size-based presets are supported before setting them
		[CaptureSession setSessionPreset:AVCaptureSessionPreset640x480];

	//----- DISPLAY THE PREVIEW LAYER -----
	//Display it full screen under our view controller's existing controls
	NSLog(@"Display the preview layer");
	CGRect layerRect = [[[self view] layer] bounds];
	[PreviewLayer setBounds:layerRect];
	[PreviewLayer setPosition:CGPointMake(CGRectGetMidX(layerRect),
										  CGRectGetMidY(layerRect))];
	//[[[self view] layer] addSublayer:[[self CaptureManager] previewLayer]];
	//We use this instead so it goes on a layer behind our UI controls (avoids us having to manually bring each control to the front):
	UIView *CameraView = [[[UIView alloc] init] autorelease];
	[[self view] addSubview:CameraView];
	[self.view sendSubviewToBack:CameraView];

	[[CameraView layer] addSublayer:PreviewLayer];

	//----- START THE CAPTURE SESSION RUNNING -----
	[CaptureSession startRunning];
}

//********** SHOULD AUTOROTATE TO INTERFACE ORIENTATION **********
- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation
{
    return (interfaceOrientation == UIInterfaceOrientationLandscapeRight);
}

//********** VIEW WILL APPEAR **********
//View about to be added to the window (called each time it appears)
//Occurs after other view's viewWillDisappear
- (void)viewWillAppear:(BOOL)animated
{
	[super viewWillAppear:animated];

	WeAreRecording = NO;
}

//********** CAMERA SET OUTPUT PROPERTIES **********
- (void) CameraSetOutputProperties
{
	//SET THE CONNECTION PROPERTIES (output properties)
	AVCaptureConnection *CaptureConnection = [MovieFileOutput connectionWithMediaType:AVMediaTypeVideo];

	//Set landscape (if required)
	if ([CaptureConnection isVideoOrientationSupported])
	{
		AVCaptureVideoOrientation orientation = AVCaptureVideoOrientationLandscapeRight;		//<<<<<SET VIDEO ORIENTATION IF LANDSCAPE
		[CaptureConnection setVideoOrientation:orientation];
	}

	//Set frame rate (if required)
	if (CaptureConnection.supportsVideoMinFrameDuration)
		CaptureConnection.videoMinFrameDuration = CMTimeMake(1, CAPTURE_FRAMES_PER_SECOND);
	if (CaptureConnection.supportsVideoMaxFrameDuration)
		CaptureConnection.videoMaxFrameDuration = CMTimeMake(1, CAPTURE_FRAMES_PER_SECOND);
}

//********** GET CAMERA IN SPECIFIED POSITION IF IT EXISTS **********
- (AVCaptureDevice *) CameraWithPosition:(AVCaptureDevicePosition) Position
{
	NSArray *Devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
	for (AVCaptureDevice *Device in Devices)
	{
		if ([Device position] == Position)
			return Device;
	}
	return nil;
}

//********** CAMERA TOGGLE **********
- (IBAction)CameraToggleButtonPressed:(id)sender
{
	if ([[AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo] count] > 1)		//Only do if device has multiple cameras
	{
		NSLog(@"Toggle camera");
		NSError *error;
		AVCaptureDeviceInput *NewVideoInput;
		AVCaptureDevicePosition position = [[VideoInputDevice device] position];
		if (position == AVCaptureDevicePositionBack)
			NewVideoInput = [[AVCaptureDeviceInput alloc] initWithDevice:[self CameraWithPosition:AVCaptureDevicePositionFront] error:&error];
		else
			NewVideoInput = [[AVCaptureDeviceInput alloc] initWithDevice:[self CameraWithPosition:AVCaptureDevicePositionBack] error:&error];

		if (NewVideoInput != nil)
		{
			[CaptureSession beginConfiguration];		//We can now change the inputs and output configuration.  Use commitConfiguration to end

			[CaptureSession removeInput:VideoInputDevice];
			if ([CaptureSession canAddInput:NewVideoInput])
			{
				[CaptureSession addInput:NewVideoInput];
				[VideoInputDevice release];
				VideoInputDevice = NewVideoInput;		//We now own the new input (it was alloc'd above)
			}
			else
			{
				[CaptureSession addInput:VideoInputDevice];		//Couldn't add the new input so restore the old one
				[NewVideoInput release];
			}

			//Set the connection properties again
			[self CameraSetOutputProperties];

			[CaptureSession commitConfiguration];
		}
	}
}

//********** START STOP RECORDING BUTTON **********
- (IBAction)StartStopButtonPressed:(id)sender
{
	if (!WeAreRecording)
	{
		//----- START RECORDING -----
		WeAreRecording = YES;

		//Create temporary URL to record to
		NSString *outputPath = [[NSString alloc] initWithFormat:@"%@%@", NSTemporaryDirectory(), @"output.mov"];	//<<Example file name; use whatever name you require
		NSURL *outputURL = [[NSURL alloc] initFileURLWithPath:outputPath];
		NSFileManager *fileManager = [NSFileManager defaultManager];
		if ([fileManager fileExistsAtPath:outputPath])
		{
			NSError *error;
			if ([fileManager removeItemAtPath:outputPath error:&error] == NO)
			{
				//Error - handle if required
			}
		}
		[outputPath release];

		//Start recording
		[MovieFileOutput startRecordingToOutputFileURL:outputURL recordingDelegate:self];
		[outputURL release];
	}
	else
	{
		//----- STOP RECORDING -----
		WeAreRecording = NO;
		[MovieFileOutput stopRecording];
	}
}

//********** DID FINISH RECORDING TO OUTPUT FILE AT URL **********
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
	  fromConnections:(NSArray *)connections
				error:(NSError *)error
{
	NSLog(@"didFinishRecordingToOutputFileAtURL - enter");

	BOOL RecordedSuccessfully = YES;
	if ([error code] != noErr)
	{
		// A problem occurred: Find out if the recording was successful.
		id value = [[error userInfo] objectForKey:AVErrorRecordingSuccessfullyFinishedKey];
		if (value)
			RecordedSuccessfully = [value boolValue];
	}
	if (RecordedSuccessfully)
	{
		//----- RECORDED SUCCESSFULLY -----
		NSLog(@"didFinishRecordingToOutputFileAtURL - success");
		ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
		if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:outputFileURL])
		{
			[library writeVideoAtPathToSavedPhotosAlbum:outputFileURL
										completionBlock:^(NSURL *assetURL, NSError *error)
			{
				if (error)
				{
					//Error - handle if required
				}
			}];
		}
		[library release];
	}
}

//********** VIEW DID UNLOAD **********
- (void)viewDidUnload
{
	[super viewDidUnload];

	[CaptureSession release];
	CaptureSession = nil;
	[MovieFileOutput release];
	MovieFileOutput = nil;
	[VideoInputDevice release];
	VideoInputDevice = nil;
}

//********** DEALLOC **********
- (void)dealloc
{
	[CaptureSession release];
	[MovieFileOutput release];
	[VideoInputDevice release];
	[PreviewLayer release];		//Release the retained PreviewLayer property

	[super dealloc];
}

@end

Storing Directly To Your Documents Directory

(i.e. instead of moving file to photo library after recording completes)

	NSString *DestFilename = @"";		//<<Set this to your required file name

	//Set the file save to URL
	NSLog(@"Starting recording to file: %@", DestFilename);
	NSString *DestPath;
	NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
	DestPath = [[paths objectAtIndex:0] stringByAppendingPathComponent:RECORD_TO_ADD_DIRECTORY];		//RECORD_TO_ADD_DIRECTORY is a #define of the sub-directory name to record into
	DestPath = [DestPath stringByAppendingPathComponent:DestFilename];
	NSURL *saveLocationURL = [[NSURL alloc] initFileURLWithPath:DestPath];
	[MovieFileOutput startRecordingToOutputFileURL:saveLocationURL recordingDelegate:self];
	[saveLocationURL release];
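If each recording needs a unique name, one option (our own sketch, using the same MovieFileOutput ivar as above) is to timestamp the file name:

```objc
//Generate a unique, timestamped .mov file name in the Documents directory
NSDateFormatter *formatter = [[[NSDateFormatter alloc] init] autorelease];
[formatter setDateFormat:@"yyyyMMdd-HHmmss"];
NSString *fileName = [NSString stringWithFormat:@"recording-%@.mov",
									[formatter stringFromDate:[NSDate date]]];
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *destPath = [[paths objectAtIndex:0] stringByAppendingPathComponent:fileName];
NSURL *saveLocationURL = [[NSURL alloc] initFileURLWithPath:destPath];
[MovieFileOutput startRecordingToOutputFileURL:saveLocationURL recordingDelegate:self];
[saveLocationURL release];
```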

We benefit hugely from resources on the web so we decided we should try and give back some of our knowledge and resources to the community by opening up many of our company’s internal notes and libraries through mini sites like this. We hope you find the site helpful.

Please feel free to comment if you can add help to this page or point out issues and solutions you have found, but please note that we do not provide support on this site. If you need help with a problem please use one of the many online forums.

  • Eric Frazer

    I’d like to know how to display the video capture preview in a sub window on the main view, at half the size of the current display’s width/height. I’m trying to do this, but I can’t figure out the coordinate space. Noob question, I know, but i still can’t figure it out.

  • sejal

    Can i play video after recording video?if yes than how?

  • Ahmed Hosny Sayed

    Thanks a lot for these so helpful scripts , but have you ever tried them with Augmented Reality , especially with Vuforia SDK , can you record video of what is being captured and the result video of tracking the object ?

  • Ahmed Hosny Sayed

    I meant to record the screen , currently I’m achieving this by taking screenshots of the screen , cause Vuforia AR result happens on screen itself ,also it takes control of default camera , so setting video in put device to default will no longer work, it’s occupied by AR one

  • Ips Brar

    Hi I am facing one problem using this code.
    I’m using this code in my app which is developed in the Landscape Mode.
    My video recording with rear camera works fine but the when I play the Video Recorded with FRONT camera it always plays UpSide Down.

    Please help, I’m using the exact same code in my project.

  • rameshwar kumavat

    its possible to present preview layer in background thread

  • erdikanik

    Thank you for nice tutuorial!

  • Hiren Dave

    Thank you so much for the article.

  • res1233

    Pretty obscene how much code it takes to express such a simple concept like “I want to record video and do something with it”. I think that most idiots could come up with an easier to use API than what Apple provides.