Lots of errors when starting spark-shell

Date: 2017-07-18 20:30:59

Tags: scala apache-spark

I installed Spark with `brew install apache-spark`. When I launch spark-shell, I get a huge number of errors. When I try to create a Spark session:

val spark = SparkSession.builder().appName("Spark Postgresql Example").getOrCreate()

I get the following errors (and many more):

Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@73a116d, see the next exception for details.

Caused by: ERROR XJ040: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@73a116d, see the next exception for details.

org.datanucleus.exceptions.NucleusDataStoreException: Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@73a116d, see the next exception for details.

Caused by: java.sql.SQLException: Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@73a116d, see the next exception for details.

Nested Throwables StackTrace:
java.sql.SQLException: Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@73a116d, see the next exception for details.

17/07/18 13:12:35 WARN HiveMetaStore: Retrying creating default database after error: Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@73a116d, see the next exception for details.

javax.jdo.JDOFatalDataStoreException: Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@73a116d, see the next exception for details.

17/07/18 13:12:35 ERROR Schema: Failed initialising database.
Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@73a116d, see the next exception for details.

1 Answer:

Answer 0 (score: 1)

This error occurs when a previous spark-shell did not exit cleanly and a new session then starts spark-shell. Try restarting spark-shell.

If it is still happening, you can try creating the session like this:

val sparkSession = org.apache.spark.sql.SparkSession.builder.getOrCreate()
val sparkContext = sparkSession.sparkContext

You can also try deleting metastore_db/dbex.lck; this should resolve your problem.
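As a sketch of that lock-file cleanup: the metastore_db directory is relative to wherever spark-shell was started, and Derby may also leave a db.lck file next to dbex.lck, so it is worth removing both (only do this when no Spark/Hive process is still running):

```shell
# Run from the directory where spark-shell was launched.
# Derby keeps its lock files inside the metastore_db directory;
# -f makes this a no-op if the files are already gone.
rm -f metastore_db/dbex.lck metastore_db/db.lck
```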

您还可以在{SPARK_HOME} / conf中配置hive-site.xml,上下文会自动在当前目录中创建一个名为metastore_db的Metastore和一个名为warehouse的文件夹。修复您要启动的目录中的权限问题spark-shell可以解决你的问题