Problems running a Java program via a shell script

Asked: 2012-04-18 08:57:39

Tags: java, shell

I have written a shell script that automatically 1) starts the Hadoop services (namenode, datanode, jobtracker, tasktracker, secondary namenode), 2) drops all tables from Hive, and 3) imports all tables from SQL Server into Hive again.

I invoke this shell script from Java. The shell script and the Java code are below.


Shell script:

export HADOOP_HOME=/home/hadoop/hadoop-0.20.2-cdh3u2/
export HIVE_HOME=/home/hadoop/hive-0.7.1/
export SQOOP_HOME=/home/hadoop/sqoop-1.3.0-cdh3u1/
export MSSQL_CONNECTOR_HOME=/home/hadoop/sqoop-sqlserver-1.0
export HBASE_HOME=/home/hadoop/hbase-0.90.1-cdh3u0
export ZOOKEEPER_HOME=/home/hadoop/zookeeper-3.3.1+10
export SQOOP_CONF_DIR=/home/hadoop/sqoop-1.3.0-cdh3u1/conf/

/home/hadoop/hadoop-0.20.2-cdh3u2/bin/hadoop/start-all.sh
/home/hadoop/hadoop-0.20.2-cdh3u2/bin/hadoop -rmr /user/hadoop/*

/home/hadoop/hive-0.7.1/bin/hive -e 'show tables' > TablesToDelete.txt
while read line1
do
    echo 'drop table '$line1
    /home/hadoop/hive-0.7.1/bin/hive -e 'drop table '$line1
done < TablesToDelete.txt

while read line
do
    echo $line" ------------------------------"
/home/hadoop/sqoop-1.3.0-cdh3u1/bin/sqoop-import --connect 'jdbc:sqlserver://192.168.1.1;username=abcd;password=12345;database=HadoopTest' --table line --hive-table $line  --create-hive-table --hive-import -m 1 --hive-drop-import-delims --hive-home /home/hadoop/hive-0.7.1 --verbose
done < /home/hadoop/sqoop-1.3.0-cdh3u1/bin/tables.txt

Java code:

public class ImportTables
{

    public static void main(String arsg[])
    {
        PrintWriter pw=null;
        try
        {
            Formatter formatter = new Formatter();
            String LogFile = "Log-"+ formatter.format("%1$tm%1$td-%1$tH%1$tM%1$tS", new Date());   
            File f=new File("/home/hadoop/"+LogFile);
            FileWriter fw1=null;   
            pw=new PrintWriter(f);

            String cmd = "/home/hadoop/sqoop-1.3.0-cdh3u1/bin/TablesToImport.sh"; // this is the command to execute in the Unix shell

            // create a process for the shell
            ProcessBuilder pb = new ProcessBuilder("bash", "-c", cmd);
            pb.redirectErrorStream(true); // use this to capture messages sent to stderr
            Process shell = pb.start();
            InputStream shellIn = shell.getInputStream(); // this captures the output from the command
            int shellExitStatus = shell.waitFor();
            // wait for the shell to finish and get the return code
            // at this point you can process the output issued by the command

            // for instance, this reads the output and writes it to System.out:
            int c;
            while ((c = shellIn.read()) != -1)
            {
                System.out.write(c);
            }

            // close the stream
            shellIn.close();
        }
        catch(Exception e)
        {
            e.printStackTrace();
            e.printStackTrace(pw);
            pw.flush();
            System.exit(1);
        }



    }
}

But when I run the program, I see nothing on the console, and the program just stays in running mode. If I put only the following code in the shell script:

/home/hadoop/hive-0.7.1/bin/hive -e 'show tables' > TablesToDelete.txt
while read line1
do
    echo 'drop table '$line1
    /home/hadoop/hive-0.7.1/bin/hive -e 'drop table '$line1
done < TablesToDelete.txt

then the output is as follows:

Cannot find hadoop installation: $HADOOP_HOME must be set or hadoop must be in the path

What is wrong with my program/script? Where and how should I set HADOOP_HOME and all the other paths in my script?

1 Answer:

Answer 0 (score: 1):

The call to waitFor is, as its name suggests, a blocking call: it halts further execution until the process has finished. But because your code is also the consumer of the process's stdout, the whole thing deadlocks once the child fills its output pipe buffer and your Java program is still stuck in waitFor. Simply move the waitFor call to after you have processed the script's output.
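A minimal, self-contained sketch of that reordering (not the questioner's exact program: here `echo` stands in for the real script path, and the output is read line by line with a BufferedReader instead of byte by byte):

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;

public class RunScript {
    public static void main(String[] args) throws Exception {
        // Hypothetical command; in the real program this would be the
        // path to TablesToImport.sh.
        ProcessBuilder pb = new ProcessBuilder("bash", "-c", "echo hello");
        pb.redirectErrorStream(true); // merge stderr into stdout

        Process shell = pb.start();

        // Drain stdout FIRST, while the child is still running, so the
        // child can never block on a full pipe buffer.
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(shell.getInputStream()))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line);
            }
        }

        // Only wait for the exit code AFTER the output has been consumed.
        int exitStatus = shell.waitFor();
        System.out.println("exit status: " + exitStatus);
    }
}
```

With this ordering the loop and waitFor cannot deadlock each other: the read loop ends when the child closes its stdout, at which point waitFor returns almost immediately.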