iOS ALAsset image metadata

Posted: July 10, 2011 by Ash Mishra in Programming

iOS provides a built-in Assets Library, which stores media that you have synced with iTunes or saved to your Camera Roll. These are usually photos or videos.

The main iOS classes for working with media Assets are the following:

  1. ALAssetsLibrary – provides access to photo groups by category type (ALAssetsGroupLibrary, ALAssetsGroupAlbum, ALAssetsGroupEvent, ALAssetsGroupFaces, ALAssetsGroupSavedPhotos, ALAssetsGroupAll).
  2. ALAssetsGroup – obtained by enumerating the library; each group holds a set of ALAssets.
  3. ALAsset – a wrapper representing a single image or video.

Other posts cover the details of using enumerators and blocks to get a set of ALAssets, so here I will focus on retrieving metadata. This can be done in two ways:
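For completeness, the class hierarchy above can be sketched as follows (a minimal outline, assuming an ALAssetsLibrary instance named library held elsewhere, e.g. as an ivar; the group type and block signatures are standard API):

```objc
// enumerate the Saved Photos group(s), then the assets inside each group
[library enumerateGroupsWithTypes:ALAssetsGroupSavedPhotos
                       usingBlock:^(ALAssetsGroup *group, BOOL *stop) {
                           // group is nil when enumeration is finished
                           [group enumerateAssetsUsingBlock:^(ALAsset *asset, NSUInteger index, BOOL *stopAssets) {
                               if (asset != nil) {
                                   // work with the asset here
                               }
                           }];
                       }
                     failureBlock:^(NSError *error) {
                         NSLog(@"Enumeration failed: %@", error);
                     }];
```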

  1. Call [[asset defaultRepresentation] metadata], which returns a dictionary containing metadata for an asset, including most of its EXIF content. In my testing this call does work, but you may run into memory issues unless you keep the calls single-threaded.
  2. Use CGImageSourceCopyPropertiesAtIndex to retrieve properties from a CGImageSourceRef; this approach lets you ask for exactly the properties you need. See the example code below for retrieving a property from an ALAsset this way.
My tests for fetching metadata show that while option 2 is more code, it is orders of magnitude faster than the much simpler metadata call in option 1.
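For reference, option 1 is a one-liner. A sketch, assuming asset is an ALAsset obtained from enumeration:

```objc
// option 1: simple but comparatively slow - returns the full metadata
// dictionary, including most of the EXIF content
NSDictionary *metadata = [[asset defaultRepresentation] metadata];
NSLog(@"metadata: %@", metadata);
```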

As an aside, creating a CGImageSourceRef from an ALAsset is easy when the asset is a JPEG. For other media formats, however, creating a CGImageSourceRef will fail unless you supply a source hint identifying the type of image; this is done by passing in the ALAssetRepresentation UTI.

ALAssetRepresentation *representation = [[self assetAtIndex:index] defaultRepresentation];

// create a buffer to hold the data for the asset's image
uint8_t *buffer = (uint8_t *)malloc(representation.size);

// copy the data from the asset into the buffer
NSUInteger length = [representation getBytes:buffer fromOffset:0 length:representation.size error:nil];

if (length == 0) {
    free(buffer);
    return nil;
}

// wrap the buffer in an NSData object; freeWhenDone:YES frees the buffer
// when the NSData is released
NSData *adata = [[NSData alloc] initWithBytesNoCopy:buffer length:representation.size freeWhenDone:YES];

// set up a source-options dictionary with a UTI hint. The UTI hint identifies
// the type of image we are dealing with (i.e. a JPEG, PNG, or possibly a RAW file)
NSDictionary *sourceOptionsDict = [NSDictionary dictionaryWithObjectsAndKeys:
                                   (id)[representation UTI], kCGImageSourceTypeIdentifierHint,
                                   nil];

// create a CGImageSource with the NSData. An image source can contain any
// number of thumbnails and full images
CGImageSourceRef sourceRef = CGImageSourceCreateWithData((CFDataRef)adata, (CFDictionaryRef)sourceOptionsDict);

[adata release];

// get a copy of the image properties from the CGImageSourceRef
CFDictionaryRef imagePropertiesDictionary = CGImageSourceCopyPropertiesAtIndex(sourceRef, 0, NULL);

CFNumberRef imageWidth = (CFNumberRef)CFDictionaryGetValue(imagePropertiesDictionary, kCGImagePropertyPixelWidth);
CFNumberRef imageHeight = (CFNumberRef)CFDictionaryGetValue(imagePropertiesDictionary, kCGImagePropertyPixelHeight);

int w = 0;
int h = 0;
CFNumberGetValue(imageWidth, kCFNumberIntType, &w);
CFNumberGetValue(imageHeight, kCFNumberIntType, &h);

// clean up: the Copy/Create functions above return owned references
CFRelease(imagePropertiesDictionary);
CFRelease(sourceRef);

  1. Dave S. says:

    I like this description of doing it via CGImageSourceRef, and I appreciate that you’ve said it’s much faster. Here’s my question: I can add an ALAssetsGroup to the library, and I can add an asset to the group; all of this code works great. Here’s the problem: when I use setImageData on the newly added asset, the failure block runs with a code of 0 and the domain and userInfo set to NULL.

    Any ideas? What I’m trying to do is create a new asset and put it in a new group all programmatically.

  2. aviellazar says:

    Thanks for the post.
    I’m a bit new to Objective-C, but I’m fiddling with the Assets Library, specifically the metadata of the assets. I’m trying to extract the metadata of the latest 100 photos and it takes around 6 seconds, even when using your technique.
    First question – why does reading the metadata take so long? I couldn’t figure out what’s going on internally.
    Second question – do you know a way to tell whether new photos were added to the asset library while the app wasn’t active? I’ve read that in iOS 5 the asset URL ids aren’t sequential.


    • Ash Mishra says:

      The metadata takes so long because internally the API loads each image and then extracts the metadata. If you don’t need all the detailed metadata, you could use -valueForProperty: on ALAsset; unfortunately this doesn’t give even basic information such as width and height.

      The way I check whether new photos have been added is by storing GUIDs based on the ALAsset date and the ALAssetRepresentation size. Each time the app launches, you can compare the current list of ALAsset GUIDs with the list you previously saved.
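      A sketch of -valueForProperty: for anyone else landing here, assuming asset is an ALAsset (the location property requires linking CoreLocation):

```objc
NSDate *date = [asset valueForProperty:ALAssetPropertyDate];
CLLocation *location = [asset valueForProperty:ALAssetPropertyLocation];
NSString *type = [asset valueForProperty:ALAssetPropertyType]; // photo or video
```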

  3. aviellazar says:

    Thanks for the quick response! I actually only needed the location and the date and I missed the valueForProperty!!!! WOW – how could I’ve missed it.
