Unable to find java

Time: 2018-02-21 17:53:43

Tags: java

I am unable to locate my Java home.

  • The /etc/environment file is empty.
  • There is no mention of it in .bashrc.
  • rpm -qa | grep java prints nothing.
  • java -version says "java: command not found".
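Taken together, these checks only cover the RPM database and the shell's PATH. A minimal sketch of why a JVM can be running while every check above comes up empty: the shell resolves `java` through $PATH alone, so a JDK unpacked from a tarball (such as the /usr/jdk64 install seen in the ps output below) is neither an rpm package nor on PATH, and all four checks miss it.

```shell
# "command not found" only means java is not on $PATH.
# Simulate a PATH that does not contain any JDK:
PATH=/nonexistent command -v java || echo "java: not on PATH"
# A tarball-installed JDK is invisible both to rpm -qa and to
# PATH lookup unless it is exported explicitly.
```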

However, java is running on the machine:

  

00:00:38 /usr/jdk64/jdk1.8.0_112/bin/java -Dproc_datanode -Xmx1024m -Dhdp.version=2.6.1.0-129 -Djava.net.preferIPv4Stack=true -Dhdp.version= -Djava.net.preferIPv4Stack=true -Dhdp.version= -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/var/log/hadoop/hdfs -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/usr/hdp/2.6.1.0-129/hadoop -Dhadoop.id.str=hdfs -Dhadoop.root.logger=INFO,console -Djava.library.path=:/usr/hdp/2.6.1.0-129/hadoop/lib/native/Linux-amd64-64:/usr/hdp/current/hadoop-client/lib/native/Linux-amd64-64:/usr/hdp/2.6.1.0-129/hadoop/lib/native -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -Dhdp.version=2.6.1.0-129 -Dhadoop.log.dir=/var/log/hadoop/hdfs -Dhadoop.log.file=hadoop-hdfs-datanode-asokpronode1.openstacklocal.log -Dhadoop.home.dir=/usr/hdp/2.6.1.0-129/hadoop -Dhadoop.id.str=hdfs -Dhadoop.root.logger=INFO,RFA -Djava.library.path=:/usr/hdp/2.6.1.0-129/hadoop/lib/native/Linux-amd64-64:/usr/hdp/current/hadoop-client/lib/native/Linux-amd64-64:/usr/hdp/2.6.1.0-129/hadoop/lib/native:/usr/hdp/current/hadoop-client/lib/native/Linux-amd64-64:/usr/hdp/2.6.1.0-129/hadoop/lib/native/Linux-amd64-64:/usr/hdp/current/hadoop-client/lib/native/Linux-amd64-64:/usr/hdp/2.6.1.0-129/hadoop/lib/native -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -server -server -XX:ParallelGCThreads=4 -XX:+UseConcMarkSweepGC -XX:ErrorFile=/var/log/hadoop/hdfs/hs_err_pid%p.log -XX:NewSize=200m -XX:MaxNewSize=200m -Xloggc:/var/log/hadoop/hdfs/gc.log-201802211556 -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -XX:+PrintGCDateStamps -Xms1024m -Xmx1024m -Dhadoop.security.logger=INFO,DRFAS -Dhdfs.audit.logger=INFO,DRFAAUDIT -server -XX:ParallelGCThreads=4 -XX:+UseConcMarkSweepGC -XX:ErrorFile=/var/log/hadoop/hdfs/hs_err_pid%p.log -XX:NewSize=200m -XX:MaxNewSize=200m -Xloggc:/var/log/hadoop/hdfs/gc.log-201802211556 -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -XX:+PrintGCDateStamps -Xms1024m -Xmx1024m -Dhadoop.security.logger=INFO,DRFAS -Dhdfs.audit.logger=INFO,DRFAAUDIT -XX:CMSInitiatingOccupancyFraction=70 -XX:+UseCMSInitiatingOccupancyOnly -XX:CMSInitiatingOccupancyFraction=70 -XX:+UseCMSInitiatingOccupancyOnly -XX:CMSInitiatingOccupancyFraction=70 -XX:+UseCMSInitiatingOccupancyOnly -Dhadoop.security.logger=INFO,RFAS org.apache.hadoop.hdfs.server.datanode.DataNode
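The ps line above already answers where Java lives: the DataNode was started from /usr/jdk64/jdk1.8.0_112/bin/java. A hedged sketch (the path is copied from the output above; a different install would use a different path) of deriving JAVA_HOME from that binary path and making it visible to the shell:

```shell
# JVM binary path, taken from the ps output above.
java_bin=/usr/jdk64/jdk1.8.0_112/bin/java

# Stripping /bin/java (two dirname calls) yields the JDK root.
JAVA_HOME=$(dirname "$(dirname "$java_bin")")
echo "$JAVA_HOME"   # /usr/jdk64/jdk1.8.0_112

# Persist it (e.g. by appending the exports to ~/.bashrc):
export JAVA_HOME
export PATH="$JAVA_HOME/bin:$PATH"
```

For a live process, the binary could also be resolved directly with readlink -f /proc/&lt;pid&gt;/exe, using the PID from the ps output.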

0 Answers:

No answers yet.