I've run into a problem. I'm trying to process data in S3, but some of the JSON records are malformed; they aren't actually valid JSON. As a result, the COPY command fails because it "found an incomplete JSON object". I've already found and fixed the bug that produced the malformed records, but I still need to load the log data that was generated before the fix.

I tried using maxerror as 100 to avoid the failure, but it doesn't work:
wh=# copy performance1 from 's3://mybuct/new 3.txt'
credentials 'aws_access_key_id=id;aws_secret_access_key=key'
maxerror as 10
format as json 'auto';
INFO: Load into table 'log' completed, 1 record(s) could not be loaded. Check 'stl_load_errors' system table for details.
INFO: Load into table 'log' completed, 0 record(s) loaded successfully.
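Following the hint in the INFO output, I inspect the failures with a query like this against the stl_load_errors system table (a minimal query; the columns are the documented stl_load_errors columns, nothing specific to my schema):

select starttime, filename, line_number, raw_line, err_reason
from stl_load_errors
order by starttime desc
limit 10;

Here err_reason shows why each record was rejected and raw_line shows the offending input line.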
Does Redshift have a parameter to skip invalid JSON rows?