GCDAsyncSocket not receiving all transmitted data — missing the last "chunk"

Date: 2014-04-04 18:38:17

Tags: objective-c sockets osx-mountain-lion asyncsocket gcdasyncsocket

I am trying to send some strings and image data from a Python script to an Objective-C application running on OS X.

I am using GCDAsyncSocket to collect the transmitted data and append it to an NSMutableData buffer until the server disconnects. I then process the NSData and split it back into its original parts.

The transmitted data consists of the following:

An ID string, padded to 16 bytes.

An image-number string, padded to 16 bytes.

The raw image data.

A termination string, padded to 16 bytes.
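As a sketch, the frame layout described above could be assembled like this in Python (`pack_frame` is a hypothetical helper, not part of my actual script, and the literal `"end"` terminator text is an assumption):

```python
def pack_frame(camera_id: str, image_number: str, image_bytes: bytes) -> bytes:
    """Assemble one frame: two 16-byte space-padded text fields, the raw
    image bytes, then a 16-byte space-padded termination string."""
    header = camera_id.ljust(16).encode("utf-8") + image_number.ljust(16).encode("utf-8")
    terminator = "end".ljust(16).encode("utf-8")  # terminator value is illustrative
    return header + image_bytes + terminator
```

Padding the two text fields guarantees the receiver can slice them at fixed offsets 0–16 and 16–32 regardless of how long the ID or image number is.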

The problem is that I am not receiving the last chunk of data. I end up missing the end of the JPEG image, which leaves me with a corrupt image (although most of it displays), and the termination string is missing.

Here is the code I am using to receive and process the data with GCDAsyncSocket:

Socket connection:

- (void)socket:(GCDAsyncSocket *)sock didAcceptNewSocket:(GCDAsyncSocket *)newSocket
{
// This method is executed on the socketQueue (not the main thread)

@synchronized(connectedSockets)
{
    [connectedSockets addObject:newSocket];
}

NSString *host = [newSocket connectedHost];
UInt16 port = [newSocket connectedPort];

dispatch_async(dispatch_get_main_queue(), ^{
    @autoreleasepool {

        [self logInfo:FORMAT(@"Accepted client %@:%hu", host, port)];

    }
});

[newSocket readDataToData:[GCDAsyncSocket CRLFData] withTimeout:-1 tag:0];

}

Socket received data:

- (void)socket:(GCDAsyncSocket *)sock didReadData:(NSData *)data withTag:(long)tag
{
// This method is executed on the socketQueue (not the main thread)

dispatch_async(dispatch_get_main_queue(), ^{
    @autoreleasepool {

        NSLog(@"Thread Data Length is %lu", (unsigned long)[data length]);
        if (!imageBuffer){
            imageBuffer = [[NSMutableData alloc]init];
        }

        [imageBuffer appendData:[data subdataWithRange:NSMakeRange(0, [data length])]];
        NSLog(@"Total Data Length is %lu", (unsigned long)[imageBuffer length]);

    }
});

// Echo message back to client
[sock writeData:data withTimeout:-1 tag:ECHO_MSG];
    [sock readDataToData:[GCDAsyncSocket CRLFData] withTimeout:-1 tag:0];
}

Socket disconnected:

- (void)socketDidDisconnect:(GCDAsyncSocket *)sock withError:(NSError *)err
{
if (sock != listenSocket)
{
    dispatch_async(dispatch_get_main_queue(), ^{
        @autoreleasepool {

            [self logInfo:FORMAT(@"Client Disconnected")];
            NSData *cameraNumberData;
            NSData *imageNumberData;
            NSData *imageData;
            NSData *endCommandData;
            //if ([data length] > 40){
            cameraNumberData = [imageBuffer subdataWithRange:NSMakeRange(0, 16)];
            imageNumberData = [imageBuffer subdataWithRange:NSMakeRange(16, 16)];
            imageData = [imageBuffer subdataWithRange:NSMakeRange(32, [imageBuffer length]-34)];
            endCommandData = [imageBuffer subdataWithRange:NSMakeRange([imageBuffer length]-16, 16)];
            //}
            NSString *cameraNumberString = [[NSString alloc] initWithData:cameraNumberData encoding:NSUTF8StringEncoding];
            NSString *imageNumberString = [[NSString alloc] initWithData:imageNumberData encoding:NSUTF8StringEncoding];
            NSString *endCommandString = [[NSString alloc] initWithData:endCommandData encoding:NSUTF8StringEncoding];
            NSImage* image = [[NSImage alloc]initWithData:imageData];
            if (cameraNumberString)
            {
                NSLog(@"Image received from Camera no %@", cameraNumberString);
                [self logMessage:cameraNumberString];
            }
            else
            {
                [self logError:@"Error converting received data into UTF-8 String"];
            }

            if (imageNumberString)
            {
                NSLog(@"Image is number %@", imageNumberString);
                [self logMessage:imageNumberString];
            }
            else
            {
                [self logError:@"Error converting received data into UTF-8 String"];
            }

            if (image)
            {
                NSLog(@"We have an image");
                [self.imageView setImage:image];
            }
            else
            {
                [self logError:@"Error converting received data into image"];
            }

            if (endCommandString)
            {
                NSLog(@"Command String is %@", endCommandString);
                [self logMessage:endCommandString];
            }
            else
            {
                [self logError:@"No command string"];
            }

            //self.imageBuffer = nil;

        }
    });

    @synchronized(connectedSockets)
    {
        [connectedSockets removeObject:sock];
    }
}
}

I have used Wireshark, and the data is definitely being transmitted; it just isn't coming through GCDAsyncSocket.

So I am obviously missing something. Socket programming and encoding/decoding data like this is relatively new to me, so I may well be doing something stupid.

Many thanks!

Gareth

1 Answer:

Answer 0 (score: 2)

OK, so I finally got this working. It involved modifying the transmission code in Python to send a completion string at the end of the data, and watching for it on the receiving side. The big takeaway was that I needed to re-issue the readDataToData: call every time the socket read some data; otherwise it would just sit there waiting, and the sending socket would just sit there too.

I also had to issue that follow-up read with a tag, so that I could store the received data in the correct NSMutableData object in my NSMutableArray. Otherwise I had no way of knowing, after the first receive, which transmitting socket the data came from, because the ID only appears at the start of the first message.
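In plain-Python terms, the fix amounts to looping on the read until the completion string arrives, instead of issuing a single read and stopping. A minimal sketch of the receiving side (`recv_frame` is a hypothetical helper, and the 16-byte padded terminator is assumed to match what the sender emits):

```python
import socket

END = b"end".ljust(16)  # assumed 16-byte padded terminator, as sent by the Python side

def recv_frame(sock: socket.socket) -> bytes:
    """Accumulate chunks until the terminator arrives -- the analogue of
    re-calling readDataToData: from every didReadData: callback."""
    buf = bytearray()
    while not buf.endswith(END):
        chunk = sock.recv(4096)
        if not chunk:        # peer closed before the terminator arrived
            break
        buf.extend(chunk)
    return bytes(buf)
```

Without the loop (or, in GCDAsyncSocket terms, without re-issuing the read), only the first chunk is ever consumed and the rest of the frame never arrives in the application.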

Here is the didReadData code:

- (void)socket:(GCDAsyncSocket *)sock didReadData:(NSData *)data withTag:(long)tag
{

dispatch_async(dispatch_get_main_queue(), ^{
    @autoreleasepool {

        NSInteger cameraNumberNumber = 0;
        NSString *cameraNumberString = [[NSString alloc]init];

        if (tag > 10){

            cameraNumberNumber = tag-11;
            DDLogVerbose(@"Second data loop, tag is %ld", tag);
        } else {

        NSData *cameraNumberData;
        //if ([data length] > 40){
        cameraNumberData = [data subdataWithRange:NSMakeRange(0, 16)];
        cameraNumberString = [[NSString alloc] initWithData:cameraNumberData encoding:NSUTF8StringEncoding]; // assign to the outer variable; redeclaring it here would shadow it and the camera number would be lost below
        cameraNumberString = [cameraNumberString stringByTrimmingCharactersInSet:[NSCharacterSet whitespaceAndNewlineCharacterSet]];
        cameraNumberNumber = [cameraNumberString intValue]-1;

        }

        if (cameraNumberNumber+1 <= self.images.count){

                if ([self.images objectAtIndex:cameraNumberNumber] == [NSNull null]){
                        image* cameraImage = [[image alloc]init];
                        [self.images replaceObjectAtIndex: cameraNumberNumber withObject:cameraImage];
                    }

                image* cameraImage = [self.images objectAtIndex:cameraNumberNumber];
                [cameraImage.imageData appendData:[data subdataWithRange:NSMakeRange(0, [data length])]];
                cameraImage.cameraNumber = cameraNumberString;

                if (!imageBuffer){
                        imageBuffer = [[NSMutableData alloc]init];
                    }


                [imageBuffer appendData:[data subdataWithRange:NSMakeRange(0, [data length])]];
                DDLogVerbose(@"Total Data Length is %lu", (unsigned long)[imageBuffer length]);
        } else {

            DDLogInfo(@"Wrong camera quantity!");
            NSAlert *testAlert = [NSAlert alertWithMessageText:@"Wrong camera quantity!"
                                                 defaultButton:@"Ok"
                                               alternateButton:nil
                                                   otherButton:nil
                                     informativeTextWithFormat:@"We have received more images than cameras, please set No.Cameras correctly!"];

            [testAlert beginSheetModalForWindow:[self window]
                                  modalDelegate:self
                                 didEndSelector:@selector(stop)
                                    contextInfo:nil];

        }

                [sock readDataToData:[@"end" dataUsingEncoding:NSUTF8StringEncoding] withTimeout:-1 tag:cameraNumberNumber + 11];

    }

});
}

And here is the socketDidDisconnect code. There is a lot going on here that won't make sense out of context, but it shows how I processed the received data.

- (void)socketDidDisconnect:(GCDAsyncSocket *)sock withError:(NSError *)err
{
if (sock != listenSocket)
{
    dispatch_async(dispatch_get_main_queue(), ^{
        @autoreleasepool {
            totalCamerasFetched = [NSNumber numberWithInt:1+[totalCamerasFetched intValue]];
            if ([totalCamerasFetched integerValue] >= [numberOfCameras integerValue]){

                for (image* cameraImage in self.images){

                        NSData *cameraNumberData;
                        NSData *imageNumberData;
                        NSData *imageData;
                        NSData *endCommandData;
                        NSInteger cameraNumberNumber = 0;
                        cameraNumberData = [cameraImage.imageData subdataWithRange:NSMakeRange(0, 16)];
                        imageNumberData = [cameraImage.imageData subdataWithRange:NSMakeRange(16, 16)];
                        imageData = [cameraImage.imageData subdataWithRange:NSMakeRange(32, [cameraImage.imageData length]-32)];
                        endCommandData = [cameraImage.imageData subdataWithRange:NSMakeRange([cameraImage.imageData length]-16, 16)];
                        NSString *cameraNumberString = [[NSString alloc] initWithData:cameraNumberData encoding:NSUTF8StringEncoding];
                        cameraNumberString = [cameraNumberString stringByTrimmingCharactersInSet:[NSCharacterSet whitespaceAndNewlineCharacterSet]];
                        NSString *imageNumberString = [[NSString alloc] initWithData:imageNumberData encoding:NSUTF8StringEncoding];
                        imageNumberString = [imageNumberString stringByTrimmingCharactersInSet:[NSCharacterSet whitespaceAndNewlineCharacterSet]];
                        NSString *endCommandString = [[NSString alloc] initWithData:endCommandData encoding:NSUTF8StringEncoding];
                        NSImage* image = [[NSImage alloc]initWithData:imageData];
                        cameraNumberNumber = [cameraNumberString intValue]-1;






                        if (cameraNumberString)
                            {
                                    DDLogInfo(@"Image received from Camera no %@", cameraNumberString);
                            }
                        else
                        {
                                    DDLogError(@"No Camera number in data");
                        }

                        if (imageNumberString)
                        {
                                    DDLogInfo(@"Image is number %@", imageNumberString);
                        }
                        else
                        {
                                    DDLogError(@"No Image number in data");
                        }




                        if (image)
                        {

                        DDLogVerbose(@"We have an image");


                        NSString* dataPath = [[NSString alloc]initWithFormat:@"%@/image%@/",self.exportLocation, imageNumberString];

                        if (![[NSFileManager defaultManager] fileExistsAtPath:dataPath]){

                                NSError* error;
                                [[NSFileManager defaultManager] createDirectoryAtPath:dataPath withIntermediateDirectories:NO attributes:nil error:&error];

                                if (error)
                                    {
                                            DDLogError(@"[%@] ERROR: attempting to write directory for images", [self class]);
                                            NSAssert( FALSE, @"Failed to create directory maybe out of disk space?");
                                        }
                            }

                        NSString* dataPathVideo = [[NSString alloc]initWithFormat:@"%@/video%@/",self.exportLocation, imageNumberString];

                        if (![[NSFileManager defaultManager] fileExistsAtPath:dataPathVideo]){

                                NSError* error;
                                [[NSFileManager defaultManager] createDirectoryAtPath:dataPathVideo withIntermediateDirectories:NO attributes:nil error:&error];

                                if (error)
                                {
                                    DDLogError(@"[%@] ERROR: attempting to write directory for images", [self class]);
                                    NSAssert( FALSE, @"Failed to create directory maybe out of disk space?");
                                }
                            }

                        NSString * exportLocationFull = [[NSString alloc]initWithFormat:@"%@/image%@/camera_%@.jpg",self.exportLocation, imageNumberString, cameraNumberString];
                            DDLogInfo(@"Full export URL = %@", exportLocationFull);
                        [imageData writeToFile:exportLocationFull atomically:YES];
                        self.currentSet = [NSNumber numberWithInt:[imageNumberString intValue]];

                        NSImage* imageToStore = [[NSImage alloc]initWithData:imageData];


                        [self.imagesToMakeVideo replaceObjectAtIndex: cameraNumberNumber withObject:imageToStore];


                        } else {
                            DDLogError(@"No image located in data");
                        }

                        if (endCommandString)
                        {
                            DDLogVerbose(@"Command String is %@", endCommandString);
                            //[self logMessage:endCommandString];
                        }
                        else
                        {
                            //[self logError:@"No command string"];
                        }

                        self.imageBuffer = nil;

                    }

                self.totalCamerasFetched = [NSNumber numberWithInt:0];
                [self loadandDisplayLatestImages];
                [self createVideowithImages:imagesToMakeVideo toLocation:[[NSString alloc]initWithFormat:@"%@/video%@/image_sequence_%@.mov",self.exportLocation, self.currentSet, self.currentSet]];
                processing = false;
            } // end of if: all cameras fetched
        }
    });

    @synchronized(connectedSockets)
    {
        [connectedSockets removeObject:sock];
    }
}

}
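The slicing done in socketDidDisconnect can be expressed compactly in Python. `parse_frame` is a hypothetical helper using the same fixed offsets as the subdataWithRange: calls above (note that this sketch excludes the 16-byte terminator from the image slice, which the Objective-C version's length-32 range does not):

```python
def parse_frame(buf: bytes):
    """Split a complete frame into its four parts; text fields are 16-byte padded."""
    camera_number = buf[0:16].decode("utf-8").strip()
    image_number = buf[16:32].decode("utf-8").strip()
    image_data = buf[32:len(buf) - 16]   # everything between header and terminator
    end_command = buf[len(buf) - 16:].decode("utf-8").strip()
    return camera_number, image_number, image_data, end_command
```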

And here is how I modified the Python code to send the extra "end" marker:

def send_media_to(self, ip, port, media_name, media_number, media_dir):
    camera_number = self.camera.current_mode['option'].number
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.connect((ip, port))
    try:
        sock.send(bytes(str(camera_number).ljust(16), 'utf-8'))
        sock.send(bytes(str(media_number).ljust(16), 'utf-8'))
        with open(media_dir + media_name, 'rb') as media:
            sock.sendall(media.read())
    finally:
        sock.send(bytes(str("end").ljust(16), 'utf-8'))
        sock.close()
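One subtlety worth noting: `str("end").ljust(16)` pads the marker with 13 trailing spaces, while the Objective-C side reads up to the bare @"end" delimiter, so the padding is left over. That is why the received fields above are trimmed with whitespaceAndNewlineCharacterSet before being compared or converted:

```python
# The sender pads the terminator to 16 bytes; the receiver's delimiter is the
# bare "end", leaving 13 trailing spaces that must be stripped from fields.
terminator = str("end").ljust(16).encode("utf-8")
print(len(terminator))   # 16
print(terminator)        # b'end             '
```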

Hope this helps anyone else stuck in the same situation!