How to create a UIImage from NSData and XMPP avatar data?

Time: 2010-07-01 19:48:39

Tags: iphone sdk uiimage xmpp nsdata

This question concerns the iPhone SDK, NSData, and UIImage.

I am trying to create an image from the avatar data returned by XMPP, such as:

<presence from='yyy@184.73.164.51/spark' to='ken@184.73.164.51/424978324712783686768453' id='Oj02v-45'><status>Away due to idle.</status><priority>0</priority><show>away</show><x xmlns='vcard-temp:x:update'><photo>a3f549fa9705e7ead2905de0b6a804227ecdd404</photo></x><x xmlns='jabber:x:avatar'><hash>a3f549fa9705e7ead2905de0b6a804227ecdd404</hash></x></presence>

So in this case, I assume that a3f549fa9705e7ead2905de0b6a804227ecdd404 is the photo data. How do I convert it into NSData?

I suppose that once I have an NSData object, I can easily create a UIImage, right?


Assuming "a3f549fa9705e7ead2905de0b6a804227ecdd404" is the photo data, here is my code:

NSString* command = @"a3f549fa9705e7ead2905de0b6a804227ecdd404";
command = [command stringByReplacingOccurrencesOfString:@" " withString:@""];
NSMutableData *commandToSend= [[NSMutableData alloc] init];
unsigned char whole_byte;
char byte_chars[3] = {'\0','\0','\0'};
int i;
for (i=0; i < [command length]/2; i++) {
    byte_chars[0] = [command characterAtIndex:i*2];
    byte_chars[1] = [command characterAtIndex:i*2+1];
    whole_byte = strtol(byte_chars, NULL, 16);
    [commandToSend appendBytes:&whole_byte length:1]; 
}

UIImage *image = [UIImage imageWithData: commandToSend];

However, it doesn't work. Does anyone know what's wrong with it?

2 Answers:

Answer 0 (score: 2)

Add this method in XMPPPresence.m:

- (NSString *)photo {
    NSXMLElement *xElement = [self elementForName:@"x" xmlns:@"vcard-temp:x:update"];
    NSString *photoHash = [[xElement elementForName:@"photo"] stringValue];
    return photoHash;
}

// In the XMPPStream delegate:

- (void)xmppStream:(XMPPStream *)stream didReceivePresence:(XMPPPresence *)presence {
    NSString *photoHash = [presence photo];
    if ([photoHash length] > 0) {   // in case there is no photo hash
        XMPPJID *rosterJID = [presence from];
        BOOL requestPhoto = ... // determine whether you need to request a new photo or not
        if (requestPhoto) {
            NSXMLElement *iqAvatar = [NSXMLElement elementWithName:@"iq"];

            NSXMLElement *queryAvatar = [NSXMLElement elementWithName:@"vCard"
                                                                xmlns:@"vcard-temp"];
            [iqAvatar addAttributeWithName:@"type" stringValue:@"get"];
            [iqAvatar addAttributeWithName:@"to" stringValue:[rosterJID full]];
            [iqAvatar addChild:queryAvatar];

            XMPPIQ *avatarRequestIQ = [XMPPIQ iqFromElement:iqAvatar];
            [stream sendElement:avatarRequestIQ];
        }
    }
}
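For reference, the IQ assembled above serializes to a plain vcard-temp fetch (XEP-0054), shown here with the sender JID from the question's presence stanza:

```xml
<iq type='get' to='yyy@184.73.164.51/spark'>
  <vCard xmlns='vcard-temp'/>
</iq>
```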

// When the buddy sends the photo, it arrives as a BASE64-encoded vCard.
// You will receive it as an IQ:

- (BOOL)xmppStream:(XMPPStream *)stream didReceiveIQ:(XMPPIQ *)iq {
    XMPPElement *vCardPhotoElement =
        (XMPPElement *)[[iq elementForName:@"vCard"] elementForName:@"PHOTO"];
    if (vCardPhotoElement != nil) {
        // avatar data
        NSString *base64DataString = [[vCardPhotoElement elementForName:@"BINVAL"] stringValue];
        NSData *imageData = [NSData dataFromBase64String:base64DataString];   // requires an NSData BASE64 category
        UIImage *avatarImage = [UIImage imageWithData:imageData];

        XMPPJID *senderJID = [iq from];
        // custom delegate method where I save the new avatar to a cache
        [self xmppStream:stream didReceiveImage:avatarImage forBuddy:senderJID];
    }
    return NO;
}

Hope this helps.

Answer 1 (score: 1)

That string is the photo's hash, not the photo itself. You now have to send a vCard request for it; the response will contain the same hash (so you can verify it) and a BINVAL element holding the photo data in base64.