Live Photos is my favorite new feature of the iPhone 6S family. A Live Photo is a hybrid of a normal photo and a short video clip, which is fantastic for capturing the moments just before and after the actual shot. I love that this new feature lets me capture moments I'd otherwise miss. Sometimes the context of a photo is much more interesting than the photo itself:

[Still photo: a photo I took during my road trip.]

[GIF: the before-and-after part. I love how my friend was left out of the photo, but you can still see him in the video.]

From a developer’s perspective, I was curious about Apple’s implementation. There weren’t many details about it after the keynote, except for this video on TechCrunch. Knowing Apple, I figured the Live Photo format would be exclusive to Apple’s devices, and that there would be demand for apps that convert Live Photos into formats shareable on other platforms. The iOS 9.1 beta dropped the same day as the keynote, so I took a quick look at the documentation and confirmed that a Live Photo is indeed a normal photo plus a video clip.

Accessing the Live Photo data turned out to be easy. You can query all the Live Photos with this snippet:

// Fetch all image assets whose media subtype marks them as Live Photos,
// newest first.
PHFetchOptions *options = [[PHFetchOptions alloc] init];
options.predicate = [NSPredicate predicateWithFormat:@"(mediaSubtype == %ld)", (long)PHAssetMediaSubtypePhotoLive];
options.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"creationDate" ascending:NO]];
PHFetchResult<PHAsset *> *assetsFetchResults = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage options:options];
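The fetch result behaves like an array, so you can pull a PHAsset straight out of it. The asset variable used in the next snippet is assumed to come from a fetch like this:

// Grab the most recent Live Photo (assumes the fetch above returned at
// least one asset; in a real app, check the count first).
PHAsset *asset = assetsFetchResults.firstObject;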

Then you can use the PHAsset to query the video and image data using PHAssetResourceManager:

PHAssetResourceManager *assetResourceManager = [PHAssetResourceManager defaultManager];
NSArray<PHAssetResource *> *assetResources = [PHAssetResource assetResourcesForAsset:asset];

[assetResources enumerateObjectsUsingBlock:^(PHAssetResource *resource, NSUInteger idx, BOOL * _Nonnull stop) {
	if (resource.type == PHAssetResourceTypePairedVideo) {
		// The resource data is delivered in chunks, so accumulate it.
		NSMutableData *videoData = [[NSMutableData alloc] init];
		[assetResourceManager requestDataForAssetResource:resource options:nil dataReceivedHandler:^(NSData * _Nonnull data) {
			[videoData appendData:data];
		} completionHandler:^(NSError * _Nullable error) {
			// do something with the complete video data
		}];
	}
	else if (resource.type == PHAssetResourceTypePhoto) {
		NSMutableData *imageData = [[NSMutableData alloc] init];
		[assetResourceManager requestDataForAssetResource:resource options:nil dataReceivedHandler:^(NSData * _Nonnull data) {
			[imageData appendData:data];
		} completionHandler:^(NSError * _Nullable error) {
			// do something with the complete image data
		}];
	}
}];
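Once you have the complete paired-video bytes, you can do whatever you need with them; for instance, write them out as a QuickTime movie for playback or conversion. This is just a sketch, and the temporary file path is my own choice for illustration:

// Write the accumulated paired-video data to a temporary .mov file.
NSString *moviePath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"livephoto.mov"];
NSError *writeError = nil;
if (![videoData writeToFile:moviePath options:NSDataWritingAtomic error:&writeError]) {
	NSLog(@"Failed to write paired video: %@", writeError);
}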

There are two new constants in iOS 9.1 that make this data accessible: PHAssetMediaSubtypePhotoLive and PHAssetResourceTypePairedVideo.
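For reference, here is roughly how those constants appear in the iOS 9.1 Photos headers (abridged; the other cases are omitted):

// PHAssetMediaSubtype is a bitmask, so PhotoLive is 1UL << 3, i.e. 8;
// PHAssetResourceType is a plain enum where PairedVideo is 9.
typedef NS_OPTIONS(NSUInteger, PHAssetMediaSubtype) {
	// ...
	PHAssetMediaSubtypePhotoLive = (1UL << 3), // 8
};

typedef NS_ENUM(NSInteger, PHAssetResourceType) {
	// ...
	PHAssetResourceTypePairedVideo = 9,
};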

Within two weeks of getting our 6S, we had a complete version of Lively (an app that converts Live Photos to GIF/Movie) and were waiting for iOS 9.1, which was supposed to ship alongside the new Apple TV. Then I had a thought about those constants, which are just integers. Looking at the iOS 9.1 header files, the value of PHAssetMediaSubtypePhotoLive is 8 and PHAssetResourceTypePairedVideo is 9. I grabbed my iPad running iOS 9.0.2, tried calling the methods with the raw integer values, and voila, it returned the same thing as on iOS 9.1 [1]. We then shipped Lively and Lean on iOS 9.0. It was an interesting hack on a public API that let us ship these apps a lot earlier. I hope this helps anyone out there who wants to support Live Photos on iOS 9.0 in their apps.

TL;DR: use the snippets above, replacing PHAssetMediaSubtypePhotoLive with the number 8 and PHAssetResourceTypePairedVideo with the number 9, to access the image and video data of Live Photos on iOS 9.0.
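For concreteness, here is a minimal sketch of the iOS 9.0 variant. The named constants aren’t defined in the iOS 9.0 SDK, so their raw values from the iOS 9.1 headers are substituted; the k-prefixed names are my own, not Apple’s:

// Raw values of the iOS 9.1 constants, usable when building against iOS 9.0.
static const NSInteger kMediaSubtypePhotoLive = 8;   // PHAssetMediaSubtypePhotoLive
static const NSInteger kResourceTypePairedVideo = 9; // PHAssetResourceTypePairedVideo

PHFetchOptions *options = [[PHFetchOptions alloc] init];
options.predicate = [NSPredicate predicateWithFormat:@"(mediaSubtype == %ld)", (long)kMediaSubtypePhotoLive];
options.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"creationDate" ascending:NO]];
PHFetchResult<PHAsset *> *assetsFetchResults = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage options:options];

// ...and in the resource enumeration, compare against the raw value:
// if (resource.type == (PHAssetResourceType)kResourceTypePairedVideo) { ... }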

P/S: we’re looking for beta testers for our new app, Live Paper, which makes Live Photo wallpapers from GIFs and movies. If you’re using an iPhone 6S/6S Plus, you should definitely check it out.


[1] There is a minor bug when querying all Live Photos on iOS 9.0: the fetch also returns Live Photos that you’ve marked as non-Live in Photos.app. This is fixed in iOS 9.1.