I'm building a small iPhone app based on an idea I already used for an Android app. For testing I obviously use the simulator, but the simulator has no access to the built-in camera. The Android workaround was to run the WebCamBroadcaster Java application on the desktop, capture frames from the built-in webcam, and push them over a socket; the app then just reads the bytes and converts them into an image.
I'm trying to do the same thing with the iPhone simulator. Searching the web I found a class for asynchronous sockets (CocoaAsyncSocket), but I can't get it to work.
The Java app sends the frames like this:
socket = ss.accept();
BufferedImage image = videoCapture.getNextImage();
if (image != null) {
    OutputStream out = socket.getOutputStream();
    if (RAW) {
        // copy the frame's pixels into an int buffer (w/h are the frame
        // width and height; the buffer declaration is assumed here)
        int[] data = new int[w * h];
        image.getWritableTile(0, 0).getDataElements(0, 0, w, h, data);
        image.releaseWritableTile(0, 0);
        // write each pixel as a 4-byte big-endian int
        DataOutputStream dout = new DataOutputStream(new BufferedOutputStream(out));
        for (int i = 0; i < data.length; i++) {
            dout.writeInt(data[i]);
        }
        dout.close();
    } else {
        ImageIO.write(image, "JPEG", out);
    }
}
The Android version uses native C code to read from the socket, like this:
long read_count, total_read = 0;
while (total_read < readBufSize)
{
    // only ask for the bytes still missing for this frame
    read_count = read(sockd, &readBuf[total_read], readBufSize - total_read);
    if (read_count <= 0 || errno != 0)
    {
        char buffer[100];
        sprintf(buffer, "socket read errorno = %d", errno);
        LOGV(buffer);
        break;
    }
    total_read += read_count;
}
// If we read all of the data we expected, we will load the frame from the pixel buffer
if (total_read == readBufSize) {
    frame = loadPixels(readBuf, width, height);
}
where readBufSize = width * height * sizeof(int) and readBuf = (char*)malloc(readBufSize).
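For reference, a direct BSD-socket version of the same connect-and-read on iOS might look like the sketch below; the host, port and helper name are made-up placeholders, and errno only carries useful information after a call has actually returned -1:

#include <stdio.h>
#include <string.h>
#include <errno.h>
#include <unistd.h>
#include <sys/socket.h>
#include <netinet/in.h>
#include <arpa/inet.h>

// Sketch: connect to the desktop broadcaster and read one raw frame.
// "192.168.1.10" and 8080 are placeholder values, not from the original post.
int readFrame(char *readBuf, size_t readBufSize)
{
    int sockd = socket(AF_INET, SOCK_STREAM, 0);
    if (sockd < 0)
        return -1;

    struct sockaddr_in addr;
    memset(&addr, 0, sizeof(addr));
    addr.sin_family = AF_INET;
    addr.sin_port = htons(8080);
    addr.sin_addr.s_addr = inet_addr("192.168.1.10");

    if (connect(sockd, (struct sockaddr *)&addr, sizeof(addr)) < 0) {
        // errno is valid here because connect() just reported a failure
        printf("connect failed, errno = %d\n", errno);
        close(sockd);
        return -1;
    }

    size_t total_read = 0;
    while (total_read < readBufSize) {
        ssize_t n = read(sockd, readBuf + total_read, readBufSize - total_read);
        if (n <= 0)   // 0 = server closed the socket, -1 = error (check errno now)
            break;
        total_read += n;
    }
    close(sockd);
    return (int)total_read;
}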
So I tried to implement the same thing on the iPhone, but I got an error on the connection (errno = 2). I then found CocoaAsyncSocket and tried that instead, but I get an unknown error and nothing ever gets read:
#import <Foundation/Foundation.h>
#import "AsyncSocket.h"
@interface Captura : NSObject {
    NSString *ipserver;
    UInt16 port;
    NSError *errPtr;
    AsyncSocket *socket;
    NSMutableData *socketData;
}
@property (nonatomic,retain) NSString *ipserver;
@property (retain) AsyncSocket *socket;
@property (retain) NSError *errPtr;
// will contain the data read from the socket
@property (retain) NSMutableData *socketData;
-(id)initWithIp:(NSString*)ip puerto:(UInt16)p;
-(BOOL)open;
-(void)close;
-(void)beginRead;
- (UIImage*)getImage;
@end
And the implementation:
#import "Captura.h"
@implementation Captura
@synthesize ipserver;
@synthesize socket;
@synthesize errPtr;
@synthesize socketData;
-(id)initWithIp:(NSString*)ip puerto:(UInt16)p{
    if (self = [super init]) {
        // retain the host string; assigning the ivar directly skips the property's retain
        ipserver = [ip retain];
        port = p;
        socket = [[AsyncSocket alloc] initWithDelegate:self];
        socketData = [[NSMutableData alloc] init];
    }
    return self;
}
//Connect
-(BOOL)open{
    return [socket connectToHost:ipserver onPort:port error:&errPtr];
}

-(void)beginRead{
    NSLog(@"Begin Read");
    NSUInteger offset = [socketData length];  // bytes already buffered (not used by this read)
    // queue one read; the first chunk that arrives within the 1 second timeout
    // is delivered to onSocket:didReadData:withTag:
    [socket readDataWithTimeout:1 tag:0];
}
- (void)onSocket:(AsyncSocket *)sock didConnectToHost:(NSString *)host port:(UInt16)port{
    NSLog(@"Conectado al servidor");
}

- (void)onSocket:(AsyncSocket *)sock didReadData:(NSData *)data withTag:(long)tag {
    NSLog(@"Data leida %u",[data length]);
    [socketData appendData:data];
    [self beginRead];
}
- (void)onSocketDidDisconnect:(AsyncSocket *)sock{
    // log before releasing, otherwise [socketData length] reads a freed object
    NSLog(@"MutableData length %u", [socketData length]);
    NSLog(@"Socket Desconectado");
    [socketData release];
    [ipserver release];
    [socket release];
}
- (void)onSocket:(AsyncSocket *)sock willDisconnectWithError:(NSError *)err{
    NSLog(@"Ocurrió un error desconectando.... %@",err);
}
- (UIImage*)getImage{
    NSUInteger frameSize = 320 * 480 * sizeof(int);
    // wait until a whole frame has been buffered
    if ([socketData length] < frameSize) {
        return nil;
    }
    // copy one frame out of the buffer and drop those bytes from socketData
    NSData *data = [socketData subdataWithRange:NSMakeRange(0, frameSize)];
    [socketData replaceBytesInRange:NSMakeRange(0, frameSize) withBytes:NULL length:0];
    NSLog(@"Data obtenida %u", [data length]);
    if ([data length]) {
        UIImage *img = [[UIImage alloc] initWithData:data];
        return img;
    }
    return nil;
}
@end
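For context, the class gets driven more or less like this (the address and port below are made up for the example):

Captura *captura = [[Captura alloc] initWithIp:@"192.168.1.10" puerto:8080];
if ([captura open]) {
    [captura beginRead];
} else {
    NSLog(@"connect failed: %@", captura.errPtr);
}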
This code connects to the server and starts the read process, then it shuts down... the socket disconnects and the application closes.
I haven't been able to test the getImage method yet...
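One note on getImage: -initWithData: only decodes encoded formats such as the JPEG branch of the Java sender. If the RAW branch is used, the buffer holds raw pixel ints and would have to be wrapped with CoreGraphics instead, roughly like this sketch (assuming 320x480 frames and the big-endian 0x00RRGGBB layout that DataOutputStream.writeInt produces; the helper name is made up and this is untested):

#import <UIKit/UIKit.h>

// Sketch: wrap a buffer of raw 32-bit pixels in a UIImage.
// Width, height and pixel layout are assumptions taken from the Java sender.
static UIImage *imageFromRawPixels(NSData *pixels, size_t width, size_t height) {
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGDataProviderRef provider = CGDataProviderCreateWithCFData((CFDataRef)pixels);
    CGImageRef cgImage = CGImageCreate(width, height,
                                       8,           // bits per component
                                       32,          // bits per pixel
                                       width * 4,   // bytes per row
                                       colorSpace,
                                       kCGImageAlphaNoneSkipFirst | kCGBitmapByteOrder32Big,
                                       provider, NULL, NO, kCGRenderingIntentDefault);
    UIImage *img = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    CGDataProviderRelease(provider);
    CGColorSpaceRelease(colorSpace);
    return img;
}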
Any ideas?
Thanks in advance...
Answer 0 (score: 0):
I think you need to call -beginRead in -onSocket:didConnectToHost:port:.
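Something along these lines (a sketch only; readDataToLength:withTimeout:tag: asks AsyncSocket for an exact number of bytes, and 320*480*sizeof(int) mirrors the frame size getImage expects):

- (void)onSocket:(AsyncSocket *)sock didConnectToHost:(NSString *)host port:(UInt16)port{
    NSLog(@"Conectado al servidor");
    // Without this the socket never has a pending read, so
    // onSocket:didReadData:withTag: is never called.
    [self beginRead];
}

-(void)beginRead{
    // Optionally ask for a whole frame at once instead of arbitrary chunks,
    // so each didReadData: callback delivers exactly one frame.
    [socket readDataToLength:320 * 480 * sizeof(int)
                 withTimeout:-1
                         tag:0];
}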