I am trying to export tables to MariaDB. Most tables work fine, but one table with about 120 columns causes a buffer overflow error.
I am running Sqoop on AWS EMR. The Sqoop export job I am running produces the following log before failing:
2017-05-11 13:33:36,674 INFO [main] org.apache.hive.hcatalog.mapreduce.InternalUtil: Initializing org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe with properties {name=default.temp_searched_sp, numFiles=4, columns.types=bigint,varchar(254),timestamp,timestamp,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,string,bigint,string,timestamp,timestamp,string,string,string,double,string,bigint,string,string,timestamp,timestamp,string,string,timestamp,timestamp,bigint,bigint,string,string,string,boolean,string,string,timestamp,timestamp,string, serialization.format=1, columns=sequence_id,id,est_received_at,received_at,advertisers_on_page,context_accept,context_accept_charset,context_accept_encoding,context_accept_language,context_akamai_origin_hop,context_akamai_reputation,context_alexatoolbar_alx_ns_ph,context_authorization,context_cache_control,context_cdma1989,context_client_ip,context_connection,context_content_length,context_content_type,context_cookie,context_d_token,context_dnt,context_el_auth_param,context_fooheader,context_gateway_ip,context_giga_transport,context_host,context_iorad_extension,context_iv_user,context_library_name,context_library_version,context_myrefer,context_oppo_request_type,context_orig_host,context_origin,context_pragma,context_prefer,context_q_token,context_ra_sid,context_ra_ver,context_referer,context_s_token,context_save_data,context_sm_user,context_surrogate_capability,context_transaction_id,context_true_client_ip,context_ua_cpu,context_up_recursive_request,context_user_agent,context_usertoken,context_via,context_wk_utd_ip,context_ws_grp,context_x_akamai_config_log_detail,context_x_angi_applicationversion,context_x_angi_featureflags,context_x_angi_proxyversion,context_x_angi_requestid,context_x_angi_sourceapplication,context_x_att_deviceid,context_x_bluecoat_via,context_x_browser_session,context_x_clickoncesupport,context_x_client_id,context_x_csix_custid,context_x_csix_custkey,context_x_cw_pageurl,context_x_elastica_gw,context_x_forwarded_for,context_x_forwarded_port,context_x_forwarded_proto,context_x_icm,context_x_imforwards,context_x_int,context_x_iws_via,context_x_mwg_via,context_x_newrelic_id,context_x_paas_uid,context_x_personasinteractive_addon,context_x_psa_client_features,context_x_psa_client_options,context_x_real_ip,context_x_request_id,context_x_requested_with,context_x_rl_forwarded_for,context_x_sfs_embed,context_x_sharepath_rum_enabled,context_x_target_proxy,context_x_wap_profile,context_xroxy_connection,event,event_text,location_info_advertising_zone,location_info_search_zip_code,est_original_timestamp,original_timestamp,results,search_for,search_params_filters_categories,search_params_filters_distance_from_provider,search_params_filters_first_name,search_params_page,search_params_query,search_params_type,est_sent_at,sent_at,sort_sort_by,sort_sort_field,est_e_timestamp,e_timestamp,total_pages,total_results,user_id,query,search_type,test_user,category_id,edh_raw_file_name,est_load_timestam
p,utc_load_timestamp,edh_bus_month, rawDataSize=112904662, columns.comments=nullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnull, numRows=155793, serialization.lib=org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, COLUMN_STATS_ACCURATE={"BASIC_STATS":"true"}, totalSize=113060455, serialization.null.format=\N, transient_lastDdlTime=1494509566}
2017-05-11 13:33:37,971 ERROR [Thread-13] org.apache.hadoop.yarn.YarnUncaughtExceptionHandler: Thread Thread[Thread-13,5,main] threw an Exception.
java.nio.BufferOverflowException
at java.nio.HeapByteBuffer.put(HeapByteBuffer.java:189)
at java.nio.ByteBuffer.put(ByteBuffer.java:859)
at org.mariadb.jdbc.internal.packet.send.SendExecutePrepareStatementPacket.send(SendExecutePrepareStatementPacket.java:105)
at org.mariadb.jdbc.internal.protocol.AbstractQueryProtocol.executePreparedQuery(AbstractQueryProtocol.java:578)
at org.mariadb.jdbc.MariaDbServerPreparedStatement.executeInternal(MariaDbServerPreparedStatement.java:279)
at org.mariadb.jdbc.MariaDbServerPreparedStatement.execute(MariaDbServerPreparedStatement.java:369)
at org.apache.sqoop.mapreduce.AsyncSqlOutputFormat$AsyncSqlExecThread.run(AsyncSqlOutputFormat.java:233)
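For reference, the shape of the export command is roughly the sketch below; the hostnames, credentials, paths, and option values are placeholders rather than my exact command:

# Sketch of a Sqoop export from an HCatalog-backed Hive table into MariaDB.
# Connection details, paths, and names are placeholders, not real values.
sqoop export \
  --connect "jdbc:mariadb://mariadb-host:3306/targetdb" \
  --driver org.mariadb.jdbc.Driver \
  --username exporter \
  --password-file /user/hadoop/.mariadb.password \
  --table temp_searched_sp \
  --hcatalog-database default \
  --hcatalog-table temp_searched_sp \
  --num-mappers 4 \
  --batch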
If anyone can give me some pointers, or has run into something similar, please let me know.
Answer 0 (score: 0)
This looks very much like this bug: CONJ-270 — a bug in the MariaDB driver's server-side PreparedStatement (see MariaDbServerPreparedStatement in your stack trace) when the statement has a large number of parameters.

So I would strongly suggest switching to the regular MySQL JDBC driver, which is widely used and thoroughly tested, unless you like exotic software and know what you are doing.
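Concretely, switching drivers for the Sqoop export could look something like the sketch below. The jar version, paths, hosts, and credentials are assumptions and placeholders, not values taken from the question. The key points are that MariaDB servers speak the MySQL wire protocol, so MySQL Connector/J can connect to them over a jdbc:mysql:// URL, and that Sqoop needs the driver jar on its classpath.

# Sketch: run the export through MySQL Connector/J instead of the MariaDB driver.
# Jar version, paths, hosts, and credentials are placeholders/assumptions.
cp mysql-connector-java-5.1.42-bin.jar /usr/lib/sqoop/lib/

sqoop export \
  --connect "jdbc:mysql://mariadb-host:3306/targetdb" \
  --driver com.mysql.jdbc.Driver \
  --username exporter \
  --password-file /user/hadoop/.mariadb.password \
  --table temp_searched_sp \
  --hcatalog-database default \
  --hcatalog-table temp_searched_sp

Alternatively, if you would rather stay on the MariaDB driver, it may be worth checking whether a newer MariaDB Connector/J release fixes CONJ-270 before switching.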