I'm using an iOS app to upload photos to a server. It's important that the photos upload with no quality loss and as JPEGs. My current problem is that the uploaded photos have no quality loss, but the file sizes are larger than expected. For example: I uploaded a file through the app and its size was 4.7 MB. When I emailed the same photo to myself and chose the "Actual Size" option for the email, the photo was only 1.7 MB. A side-by-side comparison showed no difference in quality.
Here is how I'm uploading the file:
ALAssetsLibrary *library = [ALAssetsLibrary new];
[library getImageAtURL:orderImage.imageUrl withCompletionBlock:^(UIImage *image) {
    NSData *fileData = UIImageJPEGRepresentation(image, 1.0);
    NSURLRequest *request = [self multipartFormRequestWithMethod:@"POST" path:path parameters:nil constructingBodyWithBlock:^(id<AFMultipartFormData> formData) {
        [formData appendPartWithFileData:fileData name:@"uploadedfile" fileName:fileName mimeType:mimeType];
        [formData appendPartWithFormData:[extraInfo dataUsingEncoding:NSISOLatin2StringEncoding] name:@"extraInfo"];
    }];
}];
Answer 0 (score: 5)
The problem is UIImageJPEGRepresentation. It does not retrieve the original JPEG, but rather creates a new one. And when you use a compressionQuality of 1 (presumably to avoid further image quality loss), it creates this new representation with no compression, which typically results in a file larger than the original.
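You can observe this growth yourself by comparing the size of the asset's original representation against a quality-1 re-encode (a quick sketch; `assetsLibraryURL` stands in for your asset's URL):

```objc
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library assetForURL:assetsLibraryURL resultBlock:^(ALAsset *asset) {
    ALAssetRepresentation *representation = [asset defaultRepresentation];

    // re-encode the full-resolution image the way UIImageJPEGRepresentation does
    UIImage *image = [UIImage imageWithCGImage:[representation fullResolutionImage]];
    NSData *reencoded = UIImageJPEGRepresentation(image, 1.0);

    // the re-encoded data is generally larger than the bytes already on disk
    NSLog(@"original: %lld bytes, re-encoded: %lu bytes", [representation size], (unsigned long)reencoded.length);
} failureBlock:^(NSError *error) {
    NSLog(@"error=%@", error);
}];
```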
I'd suggest retrieving the original asset with getBytes, rather than round-tripping it through a UIImage and getting the data via UIImageJPEGRepresentation:
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library assetForURL:assetsLibraryURL resultBlock:^(ALAsset *asset) {
    ALAssetRepresentation *representation = [asset defaultRepresentation];

    // I generally would write directly to a `NSOutputStream`, but if you want it in a
    // `NSData`, it would be something like:

    NSMutableData *data = [NSMutableData data];

    // now loop, reading data into buffer and writing that to our data stream

    NSError *error;
    long long bufferOffset = 0ll;
    NSInteger bufferSize = 10000;
    long long bytesRemaining = [representation size];
    uint8_t buffer[bufferSize];

    while (bytesRemaining > 0) {
        NSUInteger bytesRead = [representation getBytes:buffer fromOffset:bufferOffset length:bufferSize error:&error];
        if (bytesRead == 0) {
            NSLog(@"error reading asset representation: %@", error);
            return;
        }
        bytesRemaining -= bytesRead;
        bufferOffset += bytesRead;
        [data appendBytes:buffer length:bytesRead];
    }

    // ok, successfully read original asset;
    // do whatever you want with it here

} failureBlock:^(NSError *error) {
    NSLog(@"error=%@", error);
}];
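As the comment in the loop above notes, for large assets you may prefer to stream the bytes straight to a file instead of accumulating them in memory. A minimal sketch using NSOutputStream (the output path is hypothetical, and this assumes you already have the ALAssetRepresentation from the result block above):

```objc
NSString *outputPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"upload.jpg"]; // hypothetical destination
NSOutputStream *stream = [NSOutputStream outputStreamToFileAtPath:outputPath append:NO];
[stream open];

NSError *error;
long long bufferOffset = 0ll;
long long bytesRemaining = [representation size];
uint8_t buffer[10000];

while (bytesRemaining > 0) {
    NSUInteger bytesRead = [representation getBytes:buffer fromOffset:bufferOffset length:sizeof(buffer) error:&error];
    if (bytesRead == 0) {
        NSLog(@"error reading asset representation: %@", error);
        break;
    }
    // note: write:maxLength: may write fewer bytes than requested; a production
    // version should loop until all `bytesRead` bytes are written
    NSInteger bytesWritten = [stream write:buffer maxLength:bytesRead];
    if (bytesWritten < 0) {
        NSLog(@"error writing to stream: %@", stream.streamError);
        break;
    }
    bytesRemaining -= bytesRead;
    bufferOffset += bytesRead;
}
[stream close];
```

You can then hand the file at `outputPath` to your multipart upload (e.g. appendPartWithFileURL:) without ever holding the whole asset in memory.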
Alternatively, if you're using the Photos framework introduced in iOS 8, you can use PHImageManager to get the image data:
PHFetchResult *result = [PHAsset fetchAssetsWithALAssetURLs:@[assetsLibraryURL] options:nil];
PHAsset *asset = [result firstObject];
if (asset) {
    PHImageManager *manager = [PHImageManager defaultManager];
    [manager requestImageDataForAsset:asset options:nil resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
        // use `imageData` here
    }];
}
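One caveat worth knowing: if the original photo lives only in iCloud, requestImageDataForAsset: may hand you nil data unless you permit network access. A sketch of opting in via PHImageRequestOptions:

```objc
PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
options.networkAccessAllowed = YES;  // allow downloading the original from iCloud if needed
options.version = PHImageRequestOptionsVersionCurrent;

[[PHImageManager defaultManager] requestImageDataForAsset:asset
                                                  options:options
                                            resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
    // `dataUTI` tells you the actual on-disk format (e.g. public.jpeg),
    // so you can set the upload's mimeType accordingly
}];
```

Since the result handler may be called asynchronously in this case, make sure your upload logic runs from inside the handler.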