I'm using Apache Spark in my Java project. At the moment results are only produced when I run the program manually; I would like the program to start once and keep running continuously using Spark Streaming.
My project structure is as follows:
Package.launch:
public class App {
    public App() {
        new Launch();
    }

    public static void main(String[] args) {
        new App();
    }
}
public class Launch {
    Read read = new Read();
    Transform transform = new Transform();
    Write write = new Write();

    public Launch() {
        write.getWriter(
            transform.getTransformer(
                read.getReader()));
    }
}
Package.read:
public class Read {
    public Dataset<Row> getReader() {
        // Read from a CSV file, then return the Dataset
        return ds;
    }
}
Package.transform:
public class Transform {
    public Dataset<Row> getTransformer(Dataset<Row> ds) {
        // do the transformation on Dataset ds, then return the final Dataset
        return ds;
    }
}
Package.write:
public class Write {
    public void getWriter(Dataset<Row> ds) {
        // write the result to a CSV file
    }
}
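If true streaming semantics aren't required, one simple way to keep a pipeline like this alive is to re-run it on a fixed schedule with a plain JDK scheduler. This is only a sketch under assumptions: `ScheduledLaunch` and `runPipeline` are hypothetical names, and `runPipeline` stands in for `new Launch()` from the code above.

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

// Hypothetical sketch: re-run the batch pipeline on a fixed schedule
// instead of once. Not the asker's code; names are illustrative.
public class ScheduledLaunch {
    static int runs = 0;

    static void runPipeline() {
        // In the real project this would be: new Launch();
        runs++;
        System.out.println("pipeline run " + runs);
    }

    public static void main(String[] args) throws InterruptedException {
        ScheduledExecutorService scheduler =
            Executors.newSingleThreadScheduledExecutor();
        // For the demo, stop after three runs; a real service would block forever.
        CountDownLatch threeRuns = new CountDownLatch(3);
        scheduler.scheduleAtFixedRate(() -> {
            runPipeline();
            threeRuns.countDown();
        }, 0, 100, TimeUnit.MILLISECONDS);
        threeRuns.await();
        scheduler.shutdown();
    }
}
```

For genuine low-latency requirements, Spark's own Structured Streaming APIs are the better fit; this scheduler pattern only re-triggers the existing batch job.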
Answer 0 (score: -1)
The best way to keep a Spark Streaming job running in the background is to detach it from the terminal:
Use "nohup spark-submit <parameters> 2>&1 < /dev/null &"
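To see what the nohup pattern does without a Spark cluster, here is a minimal, self-contained sketch; the echo/sleep command is a placeholder for the real spark-submit invocation, and job.log is an assumed log-file name.

```shell
# Placeholder for: nohup spark-submit <parameters> ... &
# The echo/sleep command stands in for the real job; job.log is an assumed name.
nohup sh -c 'echo "job started"; sleep 1' > job.log 2>&1 < /dev/null &
JOB_PID=$!

# With stdin, stdout, and stderr all detached, the job survives terminal logout.
wait "$JOB_PID"
cat job.log
```

The `< /dev/null` prevents the job from blocking on terminal input, and redirecting stdout/stderr to a file means its output is still available after you log out.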
If you want to run the Java program from a shell:
java -jar {PATH TO JARFILE} $1 $2
Replace
{PATH TO JARFILE}
with the path to your jar file.