Using UIImageJPEGRepresentation (in which you are round-tripping the asset through a UIImage) can be problematic, because even with a compressionQuality of 1.0, the resulting NSData can actually be considerably larger than the original file. (Plus, you're holding a second copy of the image in the UIImage.)
For example, I just picked a random image from my iPhone's photo library: the original asset was 1.5 MB, but the NSData produced by UIImageJPEGRepresentation with a compressionQuality of 1.0 required 6.2 MB. And holding the image in a UIImage, itself, might take even more memory (because, if uncompressed, it can require, for example, four bytes per pixel).
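If you want to see this effect for yourself, a quick check along these lines (a rough sketch; it assumes you're inside didFinishPickingMediaWithInfo: and that self.library is the same ALAssetsLibrary used below, and your numbers will obviously differ from my 1.5 MB / 6.2 MB example) compares the asset's on-disk size with what UIImageJPEGRepresentation produces:

// Rough sketch: compare the original asset's size with the size of the JPEG
// data that UIImageJPEGRepresentation generates at a compressionQuality of 1.0.
UIImage *image = info[UIImagePickerControllerOriginalImage];
NSData *jpegData = UIImageJPEGRepresentation(image, 1.0);

[self.library assetForURL:info[UIImagePickerControllerReferenceURL] resultBlock:^(ALAsset *asset) {
    long long originalSize = [asset defaultRepresentation].size;
    NSLog(@"original asset: %lld bytes; UIImageJPEGRepresentation: %lu bytes",
          originalSize, (unsigned long)jpegData.length);
} failureBlock:^(NSError *error) {
    NSLog(@"assetForURL error = %@", error);
}];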
Instead, you can get the original asset using the ALAssetRepresentation method getBytes:fromOffset:length:error::
static const NSInteger kBufferSize = 1024 * 10;

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    NSURL *url = info[UIImagePickerControllerReferenceURL];

    [self.library assetForURL:url resultBlock:^(ALAsset *asset) {
        ALAssetRepresentation *representation = [asset defaultRepresentation];
        long long remaining = representation.size;
        NSString *filename = representation.filename;

        long long representationOffset = 0ll;
        NSError *error = nil;
        NSMutableData *data = [NSMutableData data];

        uint8_t buffer[kBufferSize];

        // read the original asset in kBufferSize chunks, appending each chunk
        // to `data` until the whole representation has been copied
        while (remaining > 0ll) {
            NSInteger bytesRetrieved = [representation getBytes:buffer fromOffset:representationOffset length:sizeof(buffer) error:&error];
            if (bytesRetrieved <= 0) {
                NSLog(@"failed getBytes: %@", error);
                return;
            } else {
                remaining -= bytesRetrieved;
                representationOffset += bytesRetrieved;
                [data appendBytes:buffer length:bytesRetrieved];
            }
        }

        // you can now use the `NSData`

    } failureBlock:^(NSError *error) {
        NSLog(@"assetForURL error = %@", error);
    }];
}
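(The self.library above is assumed to be a strong ALAssetsLibrary property that you create once and keep around; the ALAsset objects handed to the result block are only valid for the lifetime of the library instance that produced them.)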
This avoids staging the image in a UIImage, and the resulting NSData can be (for photos, anyway) considerably smaller. Note that this approach also preserves the metadata associated with the image.
By the way, while the above represents a significant memory improvement, there is a more dramatic memory-reduction opportunity: rather than loading the entire asset into an NSData at one time, you can stream the asset (subclass NSInputStream and use this getBytes routine to fetch bytes as they're needed, rather than loading the whole thing into memory at once). There are some annoyances involved in this process (see BJ Homer's article on the topic), but if you're looking for a dramatic reduction in the memory footprint, that's the way to go. There are a couple of approaches here (BJ's, using a staging file and streaming from that, etc.), but the key is that streaming can dramatically reduce your memory footprint.
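As a very rough sketch of that idea (this is not BJ Homer's code; the class name ALAssetInputStream and its internals are hypothetical), an NSInputStream subclass might wrap an ALAssetRepresentation and pull bytes out with getBytes:fromOffset:length:error: on demand. As BJ's article discusses, getting such a subclass to work with NSURLRequest's HTTPBodyStream requires additional plumbing beyond this, so treat it as an outline only:

#import <Foundation/Foundation.h>
#import <AssetsLibrary/AssetsLibrary.h>

// Hypothetical sketch of an NSInputStream subclass that reads straight from
// an ALAssetRepresentation instead of from an in-memory NSData.
@interface ALAssetInputStream : NSInputStream
- (instancetype)initWithAssetRepresentation:(ALAssetRepresentation *)representation;
@end

@implementation ALAssetInputStream {
    ALAssetRepresentation *_representation;
    long long              _offset;
    NSStreamStatus         _status;
}

- (instancetype)initWithAssetRepresentation:(ALAssetRepresentation *)representation {
    self = [super init];
    if (self) {
        _representation = representation;
        _offset = 0;
        _status = NSStreamStatusNotOpen;
    }
    return self;
}

- (void)open  { _status = NSStreamStatusOpen; }
- (void)close { _status = NSStreamStatusClosed; }

- (NSStreamStatus)streamStatus { return _status; }

- (BOOL)hasBytesAvailable {
    return _offset < _representation.size;
}

- (NSInteger)read:(uint8_t *)buffer maxLength:(NSUInteger)len {
    // fetch the next chunk directly from the asset; at most `len` bytes
    // are ever held in memory at once
    NSError *error = nil;
    NSUInteger bytesRead = [_representation getBytes:buffer fromOffset:_offset length:len error:&error];
    if (bytesRead == 0 && error) {
        _status = NSStreamStatusError;
        return -1;
    }
    _offset += bytesRead;
    if (_offset >= _representation.size) {
        _status = NSStreamStatusAtEnd;
    }
    return bytesRead;
}

- (BOOL)getBuffer:(uint8_t **)buffer length:(NSUInteger *)len {
    // no internal buffer to expose
    return NO;
}

@end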
But by avoiding the UIImage and UIImageJPEGRepresentation (which avoids both the memory taken up by the uncompressed image and the larger NSData that UIImageJPEGRepresentation yields), you might be able to make considerable headway. Also, make sure you don't have redundant copies of this image data in memory at one time (e.g. don't load the image data into an NSData, and then build a second NSData for the HTTPBody ... see if you can do it in one fell swoop). And if worst comes to worst, you can pursue streaming approaches.
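For example (a sketch only, assuming a simple raw-body POST to a hypothetical uploadURL rather than a multipart request), you can hand the one NSData you already built straight to the request, or go a step further and attach an input stream so the full body never has to sit in memory at once:

// Sketch: reuse the NSData you already built as the request body, rather than
// assembling another NSData yourself. uploadURL and the header value are placeholders.
NSURL *uploadURL = [NSURL URLWithString:@"https://example.com/upload"];
NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:uploadURL];
request.HTTPMethod = @"POST";
[request setValue:@"image/jpeg" forHTTPHeaderField:@"Content-Type"];
request.HTTPBody = data;

// Or, with a streaming source (e.g. a file you staged the asset to, or a custom
// NSInputStream subclass like the sketch above), avoid a full in-memory body entirely:
// request.HTTPBodyStream = [NSInputStream inputStreamWithFileAtPath:stagingPath];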