Okay, so in my app I have a ViewController that handles recording video from the camera and saving it to the Documents directory of my app's sandbox. What I want to do now is record video and, at the same time, upload the part of the file that is currently being written to a server (I'm new to this, but I'm guessing an HTTP server). The reason I want this is so that I can stream the video to a Chromecast while it is still being recorded. This should be possible, because the EZCast app already does something similar.
I have already researched how to upload video to an HTTP server, how to send video from an HTTP server to a Chromecast, and how to actually record the video, using these sources:
Chromecast: https://developers.google.com/cast/
Chromecast sample app: https://github.com/googlecast/CastVideos-ios
HTTP server: https://github.com/robbiehanson/CocoaHTTPServer
Recording from the iDevice camera: https://github.com/BradLarson/GPUImage
To cast the video I obviously have to be connected already, and the app requires that connection before it lets me into the recording view, so the code below purely casts an .mp4 video and is as simple as this (a sketch of the connection step I assume has already happened follows the method):
-(void)startCasting
{
    [self establishServer];

    self.mediaControlChannel = [[GCKMediaControlChannel alloc] init];
    self.mediaControlChannel.delegate = self;
    [self.deviceManager addChannel:self.mediaControlChannel];
    [self.mediaControlChannel requestStatus];

    // URL the Chromecast will fetch the movie from, e.g. http://192.168.1.5:50000/Movie.mp4
    NSString *path = [NSString stringWithFormat:@"http://%@:%hu/%@",
                      [self getIPAddress], [httpServer listeningPort], @"Movie.mp4"];

    NSString *image = @""; // Image URL here (currently unused)
    NSString *type = @"video/mp4"; // Video MIME type

    self.metadata = [[GCKMediaMetadata alloc] init];
    [self.metadata setString:@"" forKey:kGCKMetadataKeySubtitle]; // Description here
    [self.metadata setString:[NSString stringWithFormat:@"Casting %@", @"Movie.mp4"]
                      forKey:kGCKMetadataKeyTitle]; // Title here

    // Define the media information
    GCKMediaInformation *mediaInformation =
        [[GCKMediaInformation alloc] initWithContentID:path
                                            streamType:GCKMediaStreamTypeNone
                                           contentType:type
                                              metadata:self.metadata
                                        streamDuration:0
                                            customData:nil];

    // Cast the video
    [self.mediaControlChannel loadMedia:mediaInformation autoplay:TRUE playPosition:0];
}
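For context, this is roughly what the connection step looks like before startCasting can ever be reached. It is only a sketch based on the Cast SDK v2 style of the CastVideos-ios sample; the device argument would come from a GCKDeviceScanner, and using the default media receiver app ID is my assumption:

// Minimal sketch of the connection I assume exists before startCasting runs.
// Based on the Cast SDK v2 sample; `device` would come from a GCKDeviceScanner.
- (void)connectToDevice:(GCKDevice *)device
{
    self.deviceManager =
        [[GCKDeviceManager alloc] initWithDevice:device
                               clientPackageName:[NSBundle mainBundle].bundleIdentifier];
    self.deviceManager.delegate = self;
    [self.deviceManager connect];
}

// GCKDeviceManagerDelegate callback: launch the default receiver once connected
- (void)deviceManagerDidConnect:(GCKDeviceManager *)deviceManager
{
    [deviceManager launchApplication:kGCKMediaDefaultReceiverApplicationID];
}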
// Requires: #include <ifaddrs.h> and #include <arpa/inet.h>
- (NSString *)getIPAddress {
    NSString *address = @"error";
    struct ifaddrs *interfaces = NULL;
    struct ifaddrs *temp_addr = NULL;
    int success = 0;

    // Retrieve the current interfaces - returns 0 on success
    success = getifaddrs(&interfaces);
    if (success == 0) {
        // Loop through the linked list of interfaces
        temp_addr = interfaces;
        while (temp_addr != NULL) {
            if (temp_addr->ifa_addr->sa_family == AF_INET) {
                // Check if the interface is en0, which is the Wi-Fi connection on the iPhone
                if ([[NSString stringWithUTF8String:temp_addr->ifa_name] isEqualToString:@"en0"]) {
                    // Get an NSString from the C string
                    address = [NSString stringWithUTF8String:inet_ntoa(((struct sockaddr_in *)temp_addr->ifa_addr)->sin_addr)];
                }
            }
            temp_addr = temp_addr->ifa_next;
        }
    }
    // Free memory
    freeifaddrs(interfaces);
    return address;
}
Now, before casting, I need to establish my HTTP server. This is simple and requires very little implementation once CocoaHTTPServer has been added to the project. My code to start the server looks like this:
static const int ddLogLevel = LOG_LEVEL_VERBOSE;

- (void)establishServer
{
    [httpServer stop];

    // Configure our logging framework.
    // To keep things simple and fast, we're just going to log to the Xcode console.
    [DDLog addLogger:[DDTTYLogger sharedInstance]];

    // Create the server
    httpServer = [[HTTPServer alloc] init];

    // Tell the server to broadcast its presence via Bonjour.
    // This allows browsers such as Safari to automatically discover our service.
    [httpServer setType:@"_http._tcp."];

    // Normally there's no need to run our server on any specific port.
    // Technologies like Bonjour allow clients to dynamically discover the server's port at runtime.
    // However, for easy testing you may want to force a certain port so you can just hit the refresh button.
    // [httpServer setPort:12345];

    // Serve files from the app's Documents folder
    NSString *webPath = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/"];
    DDLogInfo(@"Setting document root: %@", webPath);
    [httpServer setDocumentRoot:webPath];

    [self startServer];
}

- (void)startServer
{
    // Start the server (and check for problems)
    NSError *error;
    if ([httpServer start:&error])
    {
        DDLogInfo(@"Started HTTP Server on port %hu", [httpServer listeningPort]);
    }
    else
    {
        DDLogError(@"Error starting HTTP Server: %@", error);
    }
}
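A quick way to sanity-check that the server is actually reachable (just a debugging sketch on my part, not part of the flow; the URL mirrors the one built in startCasting) is to fire a HEAD request at the served file:

// Debugging sketch: verify the HTTP server actually serves Movie.mp4.
// Assumes the same URL that startCasting builds.
- (void)verifyServedFile
{
    NSString *urlString = [NSString stringWithFormat:@"http://%@:%hu/Movie.mp4",
                           [self getIPAddress], [httpServer listeningPort]];
    NSMutableURLRequest *request =
        [NSMutableURLRequest requestWithURL:[NSURL URLWithString:urlString]];
    request.HTTPMethod = @"HEAD";

    [[[NSURLSession sharedSession] dataTaskWithRequest:request
                                     completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
        NSHTTPURLResponse *http = (NSHTTPURLResponse *)response;
        NSLog(@"HEAD %@ -> status %ld, error: %@", urlString, (long)http.statusCode, error);
    }] resume];
}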
Finally, I use this code to start the camera preview and record from the iPhone camera:
- (void)viewDidLoad
{
    [super viewDidLoad];

    videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionBack];
    // videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionFront];
    // videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset1280x720 cameraPosition:AVCaptureDevicePositionBack];
    // videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset1920x1080 cameraPosition:AVCaptureDevicePositionBack];

    videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;
    videoCamera.horizontallyMirrorFrontFacingCamera = NO;
    videoCamera.horizontallyMirrorRearFacingCamera = NO;

    // filter = [[GPUImageSepiaFilter alloc] init];
    // filter = [[GPUImageTiltShiftFilter alloc] init];
    // [(GPUImageTiltShiftFilter *)filter setTopFocusLevel:0.65];
    // [(GPUImageTiltShiftFilter *)filter setBottomFocusLevel:0.85];
    // [(GPUImageTiltShiftFilter *)filter setBlurSize:1.5];
    // [(GPUImageTiltShiftFilter *)filter setFocusFallOffRate:0.2];
    // filter = [[GPUImageSketchFilter alloc] init];
    filter = [[GPUImageFilter alloc] init];
    // filter = [[GPUImageSmoothToonFilter alloc] init];
    // GPUImageRotationFilter *rotationFilter = [[GPUImageRotationFilter alloc] initWithRotation:kGPUImageRotateRightFlipVertical];

    [videoCamera addTarget:filter];
    GPUImageView *filterView = (GPUImageView *)self.view;
    // filterView.fillMode = kGPUImageFillModeStretch;
    // filterView.fillMode = kGPUImageFillModePreserveAspectRatioAndFill;

    // Record a movie and store it in /Documents, visible via iTunes file sharing
    NSString *pathToMovie = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Movie.mp4"];
    unlink([pathToMovie UTF8String]); // If a file already exists, AVAssetWriter won't let you record new frames, so delete the old movie
    NSURL *movieURL = [NSURL fileURLWithPath:pathToMovie];
    movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(480.0, 640.0)];
    movieWriter.encodingLiveVideo = YES;
    // movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(640.0, 480.0)];
    // movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(720.0, 1280.0)];
    // movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(1080.0, 1920.0)];

    [filter addTarget:movieWriter];
    [filter addTarget:filterView];

    [videoCamera startCameraCapture];
}
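One detail worth flagging: as far as I can tell, GPUImageMovieWriter defaults to a QuickTime container even when the path ends in .mp4, which matters because I tell the Chromecast the content is video/mp4. There is a longer initializer that lets you ask for a real MPEG-4 file (sketch below; passing nil outputSettings just keeps the default encoding):

// Sketch: request an actual MPEG-4 container instead of the default QuickTime one.
movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL
                                                       size:CGSizeMake(480.0, 640.0)
                                                   fileType:AVFileTypeMPEG4
                                             outputSettings:nil];
movieWriter.encodingLiveVideo = YES;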
BOOL recording;

- (IBAction)Record:(id)sender
{
    if (recording == YES)
    {
        [Record setTitle:@"Record" forState:UIControlStateNormal];
        recording = NO;

        double delayInSeconds = 0.1;
        dispatch_time_t stopTime = dispatch_time(DISPATCH_TIME_NOW, delayInSeconds * NSEC_PER_SEC);
        dispatch_after(stopTime, dispatch_get_main_queue(), ^(void){
            [filter removeTarget:movieWriter];
            videoCamera.audioEncodingTarget = nil;
            [movieWriter finishRecording];
            NSLog(@"Movie completed");

            // [videoCamera.inputCamera lockForConfiguration:nil];
            // [videoCamera.inputCamera setTorchMode:AVCaptureTorchModeOff];
            // [videoCamera.inputCamera unlockForConfiguration];
        });

        UIAlertView *message = [[UIAlertView alloc] initWithTitle:@"Do You Wish To Store This Footage?"
                                                          message:@"Recording has finished. Do you wish to store this video in your camera roll?"
                                                         delegate:self
                                                cancelButtonTitle:nil
                                                otherButtonTitles:@"Yes", @"No", nil];
        [message show];
        [self dismissViewControllerAnimated:YES completion:nil];
    }
    else
    {
        double delayToStartRecording = 0.5;
        dispatch_time_t startTime = dispatch_time(DISPATCH_TIME_NOW, delayToStartRecording * NSEC_PER_SEC);
        dispatch_after(startTime, dispatch_get_main_queue(), ^(void){
            NSLog(@"Start recording");
            videoCamera.audioEncodingTarget = movieWriter;
            [movieWriter startRecording];

            // NSError *error = nil;
            // if (![videoCamera.inputCamera lockForConfiguration:&error])
            // {
            //     NSLog(@"Error locking for configuration: %@", error);
            // }
            // [videoCamera.inputCamera setTorchMode:AVCaptureTorchModeOn];
            // [videoCamera.inputCamera unlockForConfiguration];

            recording = YES;
            [Record setTitle:@"Stop" forState:UIControlStateNormal];
        });

        [self startCasting];
    }
}
As you can probably see, I'm trying to start casting right after recording begins, pointing the server at the file being written. This doesn't work. I believe the problem is that the file isn't really usable at that path until the stop button is pressed: as far as I understand, AVAssetWriter (which GPUImageMovieWriter uses underneath) only finalizes the MP4's index (the moov atom) when finishRecording completes, so the partially written file can't be played. How do I get around this? Can anybody help?
Media types supported by Chromecast: https://developers.google.com/cast/docs/media
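In case it helps anyone answering: since the media-support page above lists HLS, my current guess is that the casting side would need to treat this as a live stream rather than a finished .mp4. A sketch of what the load call might look like if the server exposed an HLS playlist instead (the stream.m3u8 name is hypothetical; producing the playlist and segments is exactly the part I don't know how to do):

// Hypothetical: if the server exposed an HLS playlist of the in-progress
// recording, the load call would change roughly like this. "stream.m3u8"
// is a made-up name; generating the playlist/segments is the open problem.
NSString *playlistURL = [NSString stringWithFormat:@"http://%@:%hu/stream.m3u8",
                         [self getIPAddress], [httpServer listeningPort]];

GCKMediaInformation *liveInfo =
    [[GCKMediaInformation alloc] initWithContentID:playlistURL
                                        streamType:GCKMediaStreamTypeLive
                                       contentType:@"application/x-mpegurl"
                                          metadata:self.metadata
                                    streamDuration:0
                                        customData:nil];

[self.mediaControlChannel loadMedia:liveInfo autoplay:YES playPosition:0];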