I installed SCDF using Ambari, but I can't seem to get the Spring Cloud Data Flow server to start. I get the following output in the logs:
03:07:30.592 [main] ERROR org.springframework.boot.SpringApplication - Application startup failed
java.lang.IllegalStateException: Failed to load property source from location 'file:/etc/scdf/conf//servers.yml'
at org.springframework.boot.context.config.ConfigFileApplicationListener$Loader.loadIntoGroup(ConfigFileApplicationListener.java:465)
at org.springframework.boot.context.config.ConfigFileApplicationListener$Loader.load(ConfigFileApplicationListener.java:432)
at org.springframework.boot.context.config.ConfigFileApplicationListener$Loader.load(ConfigFileApplicationListener.java:371)
at org.springframework.boot.context.config.ConfigFileApplicationListener.addPropertySources(ConfigFileApplicationListener.java:214)
at org.springframework.boot.context.config.ConfigFileApplicationListener.postProcessEnvironment(ConfigFileApplicationListener.java:184)
at org.springframework.boot.context.config.ConfigFileApplicationListener.onApplicationEnvironmentPreparedEvent(ConfigFileApplicationListener.java:171)
at org.springframework.boot.context.config.ConfigFileApplicationListener.onApplicationEvent(ConfigFileApplicationListener.java:157)
at org.springframework.context.event.SimpleApplicationEventMulticaster.invokeListener(SimpleApplicationEventMulticaster.java:167)
at org.springframework.context.event.SimpleApplicationEventMulticaster.multicastEvent(SimpleApplicationEventMulticaster.java:139)
at org.springframework.context.event.SimpleApplicationEventMulticaster.multicastEvent(SimpleApplicationEventMulticaster.java:122)
at org.springframework.boot.context.event.EventPublishingRunListener.environmentPrepared(EventPublishingRunListener.java:74)
at org.springframework.boot.SpringApplicationRunListeners.environmentPrepared(SpringApplicationRunListeners.java:54)
at org.springframework.boot.SpringApplication.prepareEnvironment(SpringApplication.java:325)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:296)
at org.springframework.boot.builder.SpringApplicationBuilder.run(SpringApplicationBuilder.java:134)
at org.springframework.cloud.bootstrap.BootstrapApplicationListener.bootstrapServiceContext(BootstrapApplicationListener.java:175)
at org.springframework.cloud.bootstrap.BootstrapApplicationListener.onApplicationEvent(BootstrapApplicationListener.java:98)
at org.springframework.cloud.bootstrap.BootstrapApplicationListener.onApplicationEvent(BootstrapApplicationListener.java:64)
at org.springframework.context.event.SimpleApplicationEventMulticaster.invokeListener(SimpleApplicationEventMulticaster.java:167)
at org.springframework.context.event.SimpleApplicationEventMulticaster.multicastEvent(SimpleApplicationEventMulticaster.java:139)
at org.springframework.context.event.SimpleApplicationEventMulticaster.multicastEvent(SimpleApplicationEventMulticaster.java:122)
at org.springframework.boot.context.event.EventPublishingRunListener.environmentPrepared(EventPublishingRunListener.java:74)
at org.springframework.boot.SpringApplicationRunListeners.environmentPrepared(SpringApplicationRunListeners.java:54)
at org.springframework.boot.SpringApplication.prepareEnvironment(SpringApplication.java:325)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:296)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:1118)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:1107)
at org.springframework.cloud.dataflow.server.yarn.YarnDataFlowServer.main(YarnDataFlowServer.java:34)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:48)
at org.springframework.boot.loader.Launcher.launch(Launcher.java:87)
at org.springframework.boot.loader.Launcher.launch(Launcher.java:50)
at org.springframework.boot.loader.JarLauncher.main(JarLauncher.java:51)
Caused by: org.yaml.snakeyaml.parser.ParserException: while parsing a block mapping
in 'reader', line 51, column 5:
fsUri: hdfs://hdp.kgarza.com:8020
^
expected <block end>, but found Scalar
in 'reader', line 57, column 46:
... cationClasspath: {{hadoop_home}}/conf,{{hadoop_home}}/*,{{hadoop ...
^
at org.yaml.snakeyaml.parser.ParserImpl$ParseBlockMappingKey.produce(ParserImpl.java:569)
at org.yaml.snakeyaml.parser.ParserImpl.peekEvent(ParserImpl.java:157)
at org.yaml.snakeyaml.parser.ParserImpl.checkEvent(ParserImpl.java:147)
at org.yaml.snakeyaml.composer.Composer.composeMappingNode(Composer.java:224)
at org.yaml.snakeyaml.composer.Composer.composeNode(Composer.java:155)
at org.yaml.snakeyaml.composer.Composer.composeValueNode(Composer.java:246)
at org.yaml.snakeyaml.composer.Composer.composeMappingChildren(Composer.java:237)
at org.yaml.snakeyaml.composer.Composer.composeMappingNode(Composer.java:225)
at org.yaml.snakeyaml.composer.Composer.composeNode(Composer.java:155)
at org.yaml.snakeyaml.composer.Composer.composeValueNode(Composer.java:246)
at org.yaml.snakeyaml.composer.Composer.composeMappingChildren(Composer.java:237)
at org.yaml.snakeyaml.composer.Composer.composeMappingNode(Composer.java:225)
at org.yaml.snakeyaml.composer.Composer.composeNode(Composer.java:155)
at org.yaml.snakeyaml.composer.Composer.composeDocument(Composer.java:122)
at org.yaml.snakeyaml.composer.Composer.getNode(Composer.java:84)
at org.yaml.snakeyaml.constructor.BaseConstructor.getData(BaseConstructor.java:104)
at org.yaml.snakeyaml.Yaml$1.next(Yaml.java:471)
at org.springframework.beans.factory.config.YamlProcessor.process(YamlProcessor.java:160)
at org.springframework.beans.factory.config.YamlProcessor.process(YamlProcessor.java:138)
at org.springframework.boot.env.YamlPropertySourceLoader$Processor.process(YamlPropertySourceLoader.java:101)
at org.springframework.boot.env.YamlPropertySourceLoader.load(YamlPropertySourceLoader.java:58)
at org.springframework.boot.env.PropertySourcesLoader.load(PropertySourcesLoader.java:127)
at org.springframework.boot.context.config.ConfigFileApplicationListener$Loader.doLoadIntoGroup(ConfigFileApplicationListener.java:479)
at org.springframework.boot.context.config.ConfigFileApplicationListener$Loader.loadIntoGroup(ConfigFileApplicationListener.java:462)
... 35 common frames omitted
Here is my servers.yml configuration:
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
---
spring:
  cloud:
    stream:
      kafka:
        binder:
          brokers: 192.168.1.43:6667
          zkNodes: 192.168.1.43:2181
---
spring:
  datasource:
    url: jdbc:h2:tcp://hdp.kgarza.com:19092/dataflow
    username: sa
    password:
    driverClassName: org.h2.Driver
---
spring:
  cloud:
    dataflow:
      metrics:
        collector:
          uri: http://hdp.kgarza.com:18080
    stream:
      bindings:
        applicationMetrics:
          destination: metrics
---
dataflow:
  uri: http://hdp.kgarza.com:9393
spring:
  hadoop:
    fsUri: hdfs://hdp.kgarza.com:8020
    resourceManagerAddress: hdp.kgarza.com:8050
    resourceManagerHost: hdp.kgarza.com
    resourceManagerPort: 8050
    resourceManagerSchedulerAddress: hdp.kgarza.com:8030
    jobHistoryAddress: hdp.kgarza.com:10020
    yarnApplicationClasspath: {{hadoop_home}}/conf,{{hadoop_home}}/*,{{hadoop_home}}/lib/*,/usr/hdp/current/hadoop-hdfs-client/*,/usr/hdp/current/hadoop-hdfs-client/lib/*,/usr/hdp/current/hadoop-yarn-client/*,/usr/hdp/current/hadoop-yarn-client/lib/*,/usr/hdp/current/ext/hadoop/*
    config:
      yarn.resourcemanager.scheduler.address: hdp.kgarza.com:8030
      mapreduce.application.framework.path: /hdp/apps/2.6.0.3-8/mapreduce/mapreduce.tar.gz#mr-framework
      mapreduce.application.classpath: $PWD/mr-framework/hadoop/share/hadoop/mapreduce/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*:$PWD/mr-framework/hadoop/share/hadoop/common/*:$PWD/mr-framework/hadoop/share/hadoop/common/lib/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:$PWD/mr-framework/hadoop/share/hadoop/tools/lib/*:/usr/hdp/2.6.0.3-8/hadoop/lib/hadoop-lzo-0.6.0.2.6.0.3-8.jar:/etc/hadoop/conf/secure:/usr/hdp/current/ext/hadoop/*
---
dataflow.server.port : 9393
deployer.yarn.app.baseDir : '/dataflow'
h2.server.port : 19092
maven.remoteRepositories.springRepo.url : 'https://repo.spring.io/libs-snapshot'
metrics.collector.binder : 'kafka-10'
metrics.collector.channel : 'metrics'
metrics.collector.enabled : true
metrics.collector.server.port : 18080
spring.cloud.deployer.yarn.app.streamappmaster.javaOpts : '-Xms512m -Xmx512m'
spring.cloud.deployer.yarn.app.streamappmaster.memory : 1024
spring.cloud.deployer.yarn.app.streamcontainer.javaOpts : '-Xms512m -Xmx512m'
spring.cloud.deployer.yarn.app.streamcontainer.memory : 1024
spring.cloud.deployer.yarn.app.taskappmaster.javaOpts : '-Xms512m -Xmx512m'
spring.cloud.deployer.yarn.app.taskappmaster.memory : 1024
spring.cloud.deployer.yarn.app.taskcontainer.javaOpts : '-Xms512m -Xmx512m'
spring.cloud.deployer.yarn.app.taskcontainer.memory : 1024
spring.cloud.stream.kafka.binder.brokers : '192.168.1.43:6667'
spring.cloud.stream.kafka.binder.zkNodes : '192.168.1.43:2181'
spring.rabbitmq.password : 'guest'
spring.rabbitmq.username : 'guest'
Based on the log, the problem seems to be related to the configuration around line 51 of servers.yml, but I can't see what is wrong with it.
Edit: After further review, there appears to be an error in servers.yml on this line: `yarnApplicationClasspath: {{hadoop_home}}/conf,{{hadoop_home}}/*,{{hadoop_home}}/lib/*,/usr/hdp/current/hadoop-hdfs-client/*,/usr/hdp/current/hadoop-hdfs-client/lib/*,/usr/hdp/current/hadoop-yarn-client/*,/usr/hdp/current/hadoop-yarn-client/lib/*,/usr/hdp/current/ext/hadoop/*`
I edited the file to substitute the {{hadoop_home}} variable myself, but the file seems to get overwritten every time I restart the SCDF server from Ambari.
Answer (score: 0)
Indeed, `yarnApplicationClasspath: {{hadoop_home}}` looks wrong. In the source it is `yarnApplicationClasspath: {{yarn_app_classpath}}`, and the value comes from `yarn_app_classpath = config['configurations']['yarn-site']['yarn.application.classpath']`.
Can you check what is actually in the `yarn-site` configuration under the YARN service in Ambari? I.e. I have something like:
<property>
<name>yarn.application.classpath</name>
<value>$HADOOP_CONF_DIR,/usr/hdp/current/hadoop-client/*,/usr/hdp/current/hadoop-client/lib/*,/usr/hdp/current/hadoop-hdfs-client/*,/usr/hdp/current/hadoop-hdfs-client/lib/*,/usr/hdp/current/hadoop-yarn-client/*,/usr/hdp/current/hadoop-yarn-client/lib/*</value>
</property>
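Independently of where the template variable comes from, the ParserException itself is easy to reproduce outside of SCDF: in YAML, an unquoted `{` opens a flow mapping, so once `{{hadoop_home}}` closes, the trailing `/conf,...` is a stray scalar and the parser fails exactly as SnakeYAML reports above. A minimal sketch using Python's PyYAML (an assumption for illustration; any spec-compliant YAML parser rejects the same input):

```python
import yaml  # PyYAML; used here only to demonstrate the YAML syntax issue

# Mirrors line 57 of servers.yml: the unquoted '{' opens a YAML flow
# mapping, so after '{{hadoop_home}}' closes, '/conf,...' is a stray
# scalar -> "expected <block end>, but found ..." as in the log above.
bad = """\
spring:
  hadoop:
    yarnApplicationClasspath: {{hadoop_home}}/conf,{{hadoop_home}}/*
"""
try:
    yaml.safe_load(bad)
except yaml.YAMLError as err:
    print("parse error:", type(err).__name__)

# Quoting the whole value turns it into an ordinary string, so the
# document parses even with the template placeholders still in place.
good = """\
spring:
  hadoop:
    yarnApplicationClasspath: '{{hadoop_home}}/conf,{{hadoop_home}}/*'
"""
cfg = yaml.safe_load(good)
print(cfg["spring"]["hadoop"]["yarnApplicationClasspath"])
```

So either the template variable must be substituted before the file is parsed (which is what the Ambari stack is supposed to do with `{{yarn_app_classpath}}`), or the value has to be quoted so YAML treats it as a plain string.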