Running a SparkR file in Linux

Time: 2017-11-17 08:51:53

Tags: apache-spark

From Java I call a shell script, and that shell script runs a SparkR file (Test.R). When Test.R runs, I get the following error:

  

  Error in read.df(sqlContext, source = "com.mongodb.spark.sql.DefaultSource"):

  1. Test.R contains the following code

    # Prepend the custom R library location to the search path
    .libPaths(c("/opt/app/workload/deployments/servers/R-3.4.2/lib", .libPaths()))

    # Load a Spark DataFrame from MongoDB via the Spark connector
    sw <- read.df(sqlContext, source = "com.mongodb.spark.sql.DefaultSource")
    
  2. Shell script

    sparkR filename
    
  3. Java code

    import java.io.BufferedReader;
    import java.io.IOException;
    import java.io.InputStreamReader;

    public class TestShell
    {
        public static void main(String[] args) throws IOException {
            String s = null;
            // Invoke the shell script that launches the SparkR job
            String[] cmdScript = new String[]{ "/bin/sh", "/opt/app/workload/deployments/packages/POC/training.sh" };
            Process procScript = Runtime.getRuntime().exec(cmdScript);
            // Read the script's standard output line by line
            BufferedReader stdInput = new BufferedReader(new
                InputStreamReader(procScript.getInputStream()));
            System.out.println("Here is the standard output of the command:\n");
            while ((s = stdInput.readLine()) != null) {
                System.out.println(s);
            }
        }
    }
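Note that the Java code above drains only the script's standard output. SparkR error messages (such as the `read.df` failure) are typically written to standard error, so they never reach the Java console and the real cause stays hidden. A minimal sketch of also reading the error stream is shown below; it uses a placeholder `echo` command standing in for `training.sh`, which is an assumption for illustration only:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

public class StderrDemo {
    public static void main(String[] args) throws IOException, InterruptedException {
        // Placeholder command writing to stderr; replace with the real script path
        Process p = Runtime.getRuntime().exec(
            new String[]{"/bin/sh", "-c", "echo spark-error 1>&2"});

        // Drain the error stream so failure messages are visible
        BufferedReader stdErr = new BufferedReader(
            new InputStreamReader(p.getErrorStream()));
        String line;
        while ((line = stdErr.readLine()) != null) {
            System.out.println("STDERR: " + line);
        }
        p.waitFor();
    }
}
```

Alternatively, `ProcessBuilder.redirectErrorStream(true)` merges stderr into stdout so a single reader loop sees both streams.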
    

0 Answers:

No answers