Spark worker not connecting to the master

Time: 2016-12-09 09:15:46

Tags: apache-spark

I want to set up a Spark standalone cluster. I can run the master and a slave on the same node, but a slave on a different node neither shows the master URL nor connects to the master.

I am running the command:

start-slave.sh spark://spark-server:7077

where spark-server is the hostname of my master.
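For reference, a minimal sequence for bringing up a standalone cluster by hand might look like this (the hostname `spark-server` is taken from the question; the paths assume `$SPARK_HOME` points at the Spark installation on each node):

```shell
# On the master node: start the standalone master.
# The log (and the web UI on port 8080) shows the master URL,
# e.g. spark://spark-server:7077
$SPARK_HOME/sbin/start-master.sh

# On each worker node: point the worker at the master URL.
$SPARK_HOME/sbin/start-slave.sh spark://spark-server:7077
```

After both commands succeed, the worker should appear under "Workers" in the master's web UI.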

I can ping the master from the worker, but the master's web UI does not show any workers except the one running on the same machine. The client node is running a worker, but it is standalone and does not connect to the master.

3 Answers:

Answer 0: (score: 9)

Please check the configuration file spark-env.sh on the master node. Have you set the SPARK_MASTER_HOST variable to the master node's IP address? If not, try setting it and restarting both the master and the slaves. For example, if the master node's IP is 192.168.0.1, you should set SPARK_MASTER_HOST=192.168.0.1. Note that you do not need to set this variable on the slaves.
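As a sketch, the relevant line in `conf/spark-env.sh` on the master (using the example IP 192.168.0.1 from above) would be:

```shell
# $SPARK_HOME/conf/spark-env.sh on the master node only.
# Bind the master to a routable address instead of localhost,
# otherwise workers on other machines cannot reach it.
SPARK_MASTER_HOST=192.168.0.1

# Optional: pin the port workers connect to (7077 is the default).
SPARK_MASTER_PORT=7077
```

Then restart the master (`$SPARK_HOME/sbin/stop-master.sh` followed by `$SPARK_HOME/sbin/start-master.sh`) and restart the workers.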

Answer 1: (score: 3)

1) Make sure passwordless SSH is set up between the nodes.

See the following link for setting up passwordless SSH between nodes:

http://www.tecmint.com/ssh-passwordless-login-using-ssh-keygen-in-5-easy-steps/
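The linked guide boils down to generating a key pair on the master and copying the public key to each worker, roughly as follows (the usernames and worker hostnames here are placeholders, not from the original post):

```shell
# On the master node: generate an SSH key pair (accept the defaults,
# leave the passphrase empty for unattended cluster startup).
ssh-keygen -t rsa

# Copy the public key to each worker node (placeholder hosts).
ssh-copy-id user@worker-node-1
ssh-copy-id user@worker-node-2

# Verify: this should print the remote hostname without
# prompting for a password.
ssh user@worker-node-1 hostname
```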

2) Specify the slaves' IP addresses in the slaves file in the $SPARK_HOME/conf directory

[this is the Spark folder containing the conf directory on the master node]

3) After specifying the IP addresses in the slaves file, start the Spark cluster from the master node

[execute the start-all.sh script in the $SPARK_HOME/sbin directory]
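Putting steps 2 and 3 together, the `slaves` file and launch command might look like this (the worker IPs are placeholders):

```shell
# Create $SPARK_HOME/conf/slaves on the master node:
# one worker IP or hostname per line.
cat > $SPARK_HOME/conf/slaves <<'EOF'
192.168.0.2
192.168.0.3
EOF

# Start the master and, via SSH, all workers listed in conf/slaves.
$SPARK_HOME/sbin/start-all.sh
```

This is why step 1 (passwordless SSH) matters: `start-all.sh` uses SSH to launch the worker daemons on the listed machines.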

Hope this helps.

Answer 2: (score: 2)

If you can ping the master node from the worker, that means there is network connectivity between them. To add a new worker node to the Spark master, you need to update a few configuration files. Please check the official documentation on Spark cluster launch and update the required files.

Here is another blog on Spark cluster mode that may help you.