testdb=# CREATE EXTERNAL TABLE sales_fact_1997 (
testdb(#     product_id   int,
testdb(#     time_id      int,
testdb(#     customer_id  int,
testdb(#     promotion_id int,
testdb(#     store_id     int,
testdb(#     store_sales  decimal,
testdb(#     store_cost   decimal,
testdb(#     unit_sales   decimal
testdb(# )
testdb(# LOCATION ('gphdfs://hz-cluster2/user/nrpt/hive-server/foodmart.db/sales_fact_1997')
testdb(# FORMAT 'TEXT' (DELIMITER ',');
CREATE EXTERNAL TABLE
testdb=#
testdb=# select * from sales_fact_1997 ;
ERROR: external table gphdfs protocol command ended with error. Error occurred during initialization of VM (seg0 slice1 sdw1:40000 pid=3450)
DETAIL:
Could not reserve enough space for object heap
Could not create the Java virtual machine.
Command: 'gphdfs://le/user/nrpt/hive-server/foodmart.db/sales_fact_1997'
External table sales_fact_1997, file gphdfs://hz-cluster2/user/nrpt/hive-server/foodmart.db/sales_fact_1997
I changed the -Xmx value in the hadoop-2.5.2/etc/hadoop/hadoop-env.sh file, and I can see that enough memory is free for the JVM, but I still get this error. Output below:
export GP_JAVA_OPT='-Xms20m -Xmx20m -XX:+DisplayVMOutputToStderr'

@localhost ~]$ free -m
              total        used        free      shared  buff/cache   available
Mem:            993         114         393         219         485         518
Swap:           819           0         819
Can anyone help me? I created the EXTERNAL TABLE successfully, but I cannot read the data from HDFS.
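Since gphdfs spawns a JVM on the segment hosts, not on the machine where psql runs, it helps to check free memory on every segment host rather than only one. A sketch using Greenplum's gpssh utility (the host file path and name are assumptions; point it at your own list of segment hosts):

```shell
# Run free -m on all segment hosts at once.
# /home/gpadmin/hostfile_segments is a hypothetical file listing one
# segment host per line (e.g. sdw1, sdw2, ...).
gpssh -f /home/gpadmin/hostfile_segments -e 'free -m'
```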
Answer 0 (score: 0)
Either your segment hosts do not have enough free memory, or you need to allocate more memory to the JVM. Here is how to configure the JVM for gphdfs: http://gpdb.docs.pivotal.io/4380/admin_guide/load/topics/g-gphdfs-jvm.html
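The gphdfs protocol reads its JVM options from the GP_JAVA_OPT environment variable of the Greenplum user on each segment host. A minimal sketch of setting a larger heap (the -Xmx256m value is an illustration, not a recommendation; with under 1 GB of RAM per host as shown in the free -m output above, the heap must fit within what is actually available, and freeing memory on the hosts may be the real fix):

```shell
# On every segment host, set a heap for the gphdfs JVM in the Greenplum
# user's login environment, then reload it. -Xms/-Xmx values here are
# example sizes only; tune them to the memory free on your hosts.
echo "export GP_JAVA_OPT='-Xms64m -Xmx256m'" >> ~/.bashrc
source ~/.bashrc
```

Note that the session above sets -Xmx20m, which is small enough that the JVM itself can fail to start; "Could not reserve enough space for object heap" means the OS refused the requested allocation, so either the heap setting or the hosts' free memory has to change.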