I'm struggling with this problem when trying to use sparkR:
sparkR.session(master = "local[*]", sparkConfig = list(spark.driver.memory = "1g"))
Error in handleErrors(returnStatus, conn) :
java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState':
at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:981)
at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
at org.apache.spark.sql.api.r.SQLUtils$$anonfun$setSparkContextSessionConf$2.apply(SQLUtils.scala:67)
at org.apache.spark.sql.api.r.SQLUtils$$anonfun$setSparkContextSessionConf$2.apply(SQLUtils.scala:66)
at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)
at scala.collection.Iterator$class.foreach(Iterator.scala:893)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
at scala.collection.Traversabl
I'm hoping for a clear-cut solution; I'm new to this and know nothing about Java or Scala. Many thanks!
Answer 0 (score: 0)
I had the same error. It looks like it is related to user permissions, so you have two options:
1) Start sparkR from a directory where you have the necessary permissions (prerequisite: the Spark bin folder must be on your PATH environment variable: export PATH=$SPARK_HOME/bin:$PATH):
cd ~
sparkR
2) Start sparkR with sudo privileges:
/opt/spark/bin $ sudo ./sparkR
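As a hedged illustration of option 1: Spark's embedded Derby metastore writes metastore_db/ and derby.log into the current working directory, which is why launching from a directory you cannot write to can trigger this HiveSessionState error. A minimal sketch that checks the launch directory before starting (the directory choice is just an example; adjust to your setup):

```shell
#!/bin/sh
# Sketch: verify the intended launch directory is writable before starting
# SparkR, since Spark creates metastore_db/ and derby.log there.
launch_dir="$HOME"   # example: any directory you own

if [ -w "$launch_dir" ]; then
    echo "writable: $launch_dir"
    # cd "$launch_dir" && sparkR   # launch from here (requires Spark on PATH)
else
    echo "not writable: $launch_dir" >&2
    exit 1
fi
```

The sparkR launch itself is commented out so the check can be run on its own; uncomment it once the directory test passes.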
Answer 1 (score: 0)
Please try removing … from your environment variables.