How to send an IplImage from a server to an iPod client as a UIImage over TCP

Date: 2011-07-01 10:23:27

Tags: ios sockets uiimage istream iplimage

I have a server in Linux using Berkeley sockets, and I establish a TCP connection with an iPod client. I have an IplImage* img; to send from the server to the iPod. I use the write(socket, /*DATA*/, 43200); call, and as the data to send I have tried: reinterpret_cast<char*>(img), img, and img->imageData. All of these options do in fact send some kind of data.
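For context: of the three options tried, `img->imageData` is the one that points at the raw pixel bytes, while `reinterpret_cast<char*>(img)` sends the IplImage header struct itself. Note also that a single `write()` on a TCP socket may send fewer bytes than requested, so the sending side needs a loop as well. A minimal sketch of such a loop (the `sendAll` name is my own, not from the question):

```cpp
#include <unistd.h>
#include <cstddef>

// Send exactly `len` bytes over a socket/descriptor, retrying on short
// writes; returns true on success, false on error or closed connection.
bool sendAll(int fd, const char* data, size_t len) {
    size_t sent = 0;
    while (sent < len) {
        ssize_t n = write(fd, data + sent, len - sent);
        if (n <= 0) return false;          // error, or peer went away
        sent += static_cast<size_t>(n);
    }
    return true;
}
```

On the server this would be called as `sendAll(newsockfd, img->imageData, 43200)`.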

On the iPod side, I receive the data this way (as I saw here on SO; don't mind the complicated part, it is just for receiving all the data of a single image):

bytesRead = [iStream read: (char*)[buffer mutableBytes] + totalBytesRead maxLength: 43200 - totalBytesRead];

After the whole image has been received, I have this:

[buffer setLength: 43200];
NSData *imagem = [NSData dataWithBytes:buffer length:43200];
UIImage *final= [self UIImageFromIplImage:imagem];
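The `totalBytesRead` accumulation above does for `NSInputStream` what a classic receive-all loop does for plain Berkeley sockets. For comparison, the same logic sketched in C++ on the socket side (the `recvAll` name is hypothetical):

```cpp
#include <unistd.h>
#include <cstddef>

// Read exactly `len` bytes into `buf`, looping on short reads, just like
// the totalBytesRead accumulation in the NSInputStream code above.
bool recvAll(int fd, char* buf, size_t len) {
    size_t total = 0;
    while (total < len) {
        ssize_t n = read(fd, buf + total, len - total);
        if (n <= 0) return false;          // error, or peer closed early
        total += static_cast<size_t>(n);
    }
    return true;
}
```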

Now.. I know I could get OpenCV working on the iPod, but I couldn't find a simple explanation of how to set it up, so I used the second code from this webpage and adapted it, since I know all the specs of my image (e.g. I set all the variables in the CGImageCreate() function):

- (UIImage *)UIImageFromIplImage:(NSData *)image {

CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceGray();

// Allocating the buffer for CGImage
NSData *data = [NSData dataWithBytes:image length:43200];

CGDataProviderRef provider = CGDataProviderCreateWithCFData((CFDataRef)data);

// Creating CGImage from chunk of IplImage    
size_t width = 240;
size_t height = 180;
size_t depth = 8;             //bitsPerComponent
size_t depthXnChannels = 8;   //bitsPerPixel
size_t widthStep = 240;       //bytesPerRow

CGImageRef imageRef = CGImageCreate(width, height, depth, depthXnChannels, widthStep, colorSpace, kCGImageAlphaNone|kCGBitmapByteOrderDefault,provider, NULL, false, kCGRenderingIntentDefault);

// Getting UIImage from CGImage
UIImage *ret = [UIImage imageWithCGImage:imageRef];
lolView.image = ret;
CGImageRelease(imageRef);
CGDataProviderRelease(provider);
CGColorSpaceRelease(colorSpace);
return ret;

}

PROBLEM: When I display the image, it comes out all weird and "random", even though the image sent is always the same. I really can't tell what is wrong...

PS: The TCP connection works fine with other data, such as numbers or words. The image is grayscale.

Thanks for the help.

1 Answer:

Answer 0 (score: 1):

I got it working like this. On the server side (Code::Blocks in Linux with openFrameworks (& ofxOpenCv)):

img.allocate(240, 180, OF_IMAGE_COLOR);                    //ofImage
img2.allocate(240, 180);                                   //ofxCvColorImage
frame = cvCreateImage(cvSize(240,180), IPL_DEPTH_8U, 3);   //IplImage
bw = cvCreateImage(cvSize(240,180), IPL_DEPTH_8U, 1);      //IplImage
gray.allocate(240, 180);                                   //ofxCvGrayscaleImage


///ofImage
img.loadImage("lol.jpg");

///ofImage -> ofxCvColor
img2.setFromPixels(img.getPixels(), 240, 180);

///ofxCvColor -> IplImage
frame = img2.getCvImage();

///IplImage in GRAY
cvCvtColor(frame,bw,CV_RGB2GRAY);
cvThreshold(bw,bw,200,255,CV_THRESH_BINARY);  //It is actually a binary image
gray = bw;
pix = gray.getPixels();

n=write(newsockfd,pix,43200);
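One detail worth checking before the `write()`: OpenCV may pad each IplImage row up to a 4-byte boundary, so `widthStep` can exceed `width`. At 240x180 grayscale the rows are already 4-byte aligned, so the buffer is a tight 240 * 180 = 43200 bytes, but for other sizes the rows would have to be packed first. A sketch of that packing step (the `packRows` helper is hypothetical, assuming an 8-bit single-channel image):

```cpp
#include <cstring>
#include <vector>

// Copy a possibly row-padded grayscale image (widthStep >= width) into a
// tightly packed width*height buffer suitable for sending in one block.
std::vector<char> packRows(const char* imageData, int width, int height,
                           int widthStep) {
    std::vector<char> packed(static_cast<size_t>(width) * height);
    for (int y = 0; y < height; ++y)
        std::memcpy(&packed[static_cast<size_t>(y) * width],
                    imageData + static_cast<size_t>(y) * widthStep,
                    static_cast<size_t>(width));
    return packed;
}
```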

On the client side (iPod, iOS 4.3):

-(UIImage *) dataFromIplImageToUIImage:(unsigned char *) rawData
{
size_t width = 240;
size_t height = 180;
size_t depth = 8;                   //bitsPerComponent
size_t depthXnChannels = 8;         //bitsPerPixel
size_t widthStep = 240;             //bytesPerRow

CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceGray();
CGContextRef ctx = CGBitmapContextCreate(rawData, width, height, depth, widthStep, colorSpace, kCGImageAlphaNone);

CGImageRef imageRef = CGBitmapContextCreateImage(ctx);
UIImage *rawImage = [UIImage imageWithCGImage:imageRef];

// Release the Core Graphics objects; the UIImage retains what it needs.
CGImageRelease(imageRef);
CGContextRelease(ctx);
CGColorSpaceRelease(colorSpace);

myImageView.image = rawImage;
return rawImage;
}

There is probably an easier way to do this, but hey, it gets the job done. Hope it helps somebody.