Unable to parse date error while inserting into DynamoDB?

Time: 2018-05-02 14:14:40

Tags: java amazon-web-services amazon-s3 amazon-cloudwatch

I am working on a small POC in AWS. I am trying to read a CSV file from an S3 bucket and insert that CSV data into DynamoDB. Everything works fine until the function iterates over the CSV data: it terminates on the very first row.

I have created a table named vehicledata in DynamoDB, with veh_price_id as its primary key.
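For reference, the table definition amounts to something like the sketch below; only the table name and the key name come from my setup, while the String key type and the throughput numbers are assumptions:

import com.amazonaws.services.dynamodbv2.AmazonDynamoDB;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDBClient;
import com.amazonaws.services.dynamodbv2.model.AttributeDefinition;
import com.amazonaws.services.dynamodbv2.model.CreateTableRequest;
import com.amazonaws.services.dynamodbv2.model.KeySchemaElement;
import com.amazonaws.services.dynamodbv2.model.KeyType;
import com.amazonaws.services.dynamodbv2.model.ProvisionedThroughput;
import com.amazonaws.services.dynamodbv2.model.ScalarAttributeType;

public class CreateVehicledataTable {
    public static void main(String[] args) {
        AmazonDynamoDB client = new AmazonDynamoDBClient();
        CreateTableRequest request = new CreateTableRequest()
                .withTableName("vehicledata")
                // veh_price_id is the hash key; the String type is an assumption
                .withKeySchema(new KeySchemaElement("veh_price_id", KeyType.HASH))
                .withAttributeDefinitions(new AttributeDefinition("veh_price_id", ScalarAttributeType.S))
                // throughput values are placeholders
                .withProvisionedThroughput(new ProvisionedThroughput(5L, 5L));
        client.createTable(request);
    }
}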

Going by the log output, everything runs fine until the "I am inside while loop" message; the failure occurs right after it.
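The failure itself is easy to reproduce in isolation. A minimal sketch, assuming a SimpleDateFormat-style parse somewhere in the row-handling path (the real format string may differ):

import java.text.ParseException;
import java.text.SimpleDateFormat;

public class DateParseDemo {
    public static void main(String[] args) throws ParseException {
        // the format string here is an assumption for illustration
        SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd");
        System.out.println(fmt.parse("2018-05-02"));   // a real date parses fine
        // a non-date cell, such as the literal header text "date", throws:
        // java.text.ParseException: Unparseable date: "date"
        System.out.println(fmt.parse("date"));
    }
}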

Code

Helper.java

public class Helper {
    // Item parseIt(String[] line) converts one CSV row into a DynamoDB Item;
    // its body was not included in the post
}
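Only the Helper stub above made it into the post. For context, here is a minimal sketch of the kind of parseIt this setup implies; the column layout, the attribute names other than veh_price_id, and the date format are illustrative assumptions, not the real implementation:

import java.text.ParseException;
import java.text.SimpleDateFormat;

import com.amazonaws.services.dynamodbv2.document.Item;

public class Helper {
    // Assumed CSV layout: veh_price_id, date, price -- for illustration only
    private static final SimpleDateFormat DATE_FORMAT = new SimpleDateFormat("yyyy-MM-dd");

    public Item parseIt(String[] line) throws ParseException {
        return new Item()
                .withPrimaryKey("veh_price_id", line[0])
                // a parse call like this fails the moment it sees a header
                // cell or a value in an unexpected format
                .withLong("veh_date", DATE_FORMAT.parse(line[1]).getTime())
                .withString("veh_price", line[2]);
    }
}

A date-parsing call of this shape is the natural place for the error in the title to originate. The Lambda handler itself: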

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URLDecoder;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

import com.amazonaws.regions.Region;
import com.amazonaws.regions.Regions;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDB;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDBClient;
import com.amazonaws.services.dynamodbv2.document.BatchWriteItemOutcome;
import com.amazonaws.services.dynamodbv2.document.DynamoDB;
import com.amazonaws.services.dynamodbv2.document.Item;
import com.amazonaws.services.dynamodbv2.document.TableWriteItems;
import com.amazonaws.services.dynamodbv2.model.WriteRequest;
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.LambdaLogger;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.S3Event;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3Client;
import com.amazonaws.services.s3.event.S3EventNotification.S3EventNotificationRecord;
import com.amazonaws.services.s3.model.GetObjectRequest;
import com.amazonaws.services.s3.model.S3Object;
import com.google.common.collect.Lists;
import com.opencsv.CSVReader;

public class LambdaFunctionHandler implements RequestHandler<S3Event, Report> {

    Region AWS_REGION = Region.getRegion(Regions.US_EAST_1);
    String DYNAMO_TABLE_NAME = "vehicledata";

    public Report handleRequest(S3Event s3event, Context context) {
        long startTime = System.currentTimeMillis();
        Report statusReport = new Report();
        LambdaLogger logger = context.getLogger();

        Helper helper = new Helper();

        try {
            // Resolve the bucket and key that triggered the event; S3 event keys
            // are URL-encoded, with '+' standing in for spaces
            S3EventNotificationRecord record = s3event.getRecords().get(0);
            String srcBucket = record.getS3().getBucket().getName();
            String srcKey = record.getS3().getObject().getKey().replace('+', ' ');
            srcKey = URLDecoder.decode(srcKey, "UTF-8");

            AmazonS3 s3Client = new AmazonS3Client();
            S3Object s3Object = s3Client.getObject(new GetObjectRequest(srcBucket, srcKey));
            logger.log("I am inside lambda function8");
            statusReport.setFileSize(s3Object.getObjectMetadata().getContentLength());
            logger.log("I am inside lambda function9");

            logger.log("S3 Event Received: " + srcBucket + "/" + srcKey);
            logger.log("I am inside lambda function10");

            // Stream the object body straight into the CSV reader
            BufferedReader br = new BufferedReader(new InputStreamReader(s3Object.getObjectContent()));
            logger.log("I am inside lambda function13");
            CSVReader reader = new CSVReader(br);
            logger.log("I am inside lambda function14");

            AmazonDynamoDB dynamoDBClient = new AmazonDynamoDBClient();
            dynamoDBClient.setRegion(AWS_REGION);
            DynamoDB dynamoDB = new DynamoDB(dynamoDBClient);
            logger.log("I am inside DynamoDB");
            TableWriteItems energyDataTableWriteItems = new TableWriteItems(DYNAMO_TABLE_NAME);
            logger.log("I am inside DynamoDB-Table");
            List<Item> itemList = new ArrayList<Item>();

            // Convert each CSV row into an Item; this loop dies on the very
            // first row with the "unable to parse date" error
            String[] nextLine;
            while ((nextLine = reader.readNext()) != null) {
                logger.log("I am inside while loop");
                Item newItem = helper.parseIt(nextLine);
                itemList.add(newItem);
            }

            // BatchWriteItem accepts at most 25 items per call, hence the chunking
            for (List<Item> partition : Lists.partition(itemList, 25)) {
                energyDataTableWriteItems.withItemsToPut(partition);
                BatchWriteItemOutcome outcome = dynamoDB.batchWriteItem(energyDataTableWriteItems);
                logger.log("I am inside for loop");
                do {
                    // Re-submit anything DynamoDB reported as unprocessed
                    Map<String, List<WriteRequest>> unprocessedItems = outcome.getUnprocessedItems();
                    if (outcome.getUnprocessedItems().size() > 0) {
                        logger.log("Retrieving the unprocessed "
                                + String.valueOf(outcome.getUnprocessedItems().size()) + " items.");
                        outcome = dynamoDB.batchWriteItemUnprocessed(unprocessedItems);
                    }
                } while (outcome.getUnprocessedItems().size() > 0);
            }

            logger.log("Load finish in " + String.valueOf(System.currentTimeMillis() - startTime) + "ms");

            reader.close();
            br.close();
            //isr.close();
            //gis.close();
            s3Object.close();

            statusReport.setStatus(true);
        } catch (Exception ex) {
            // Only the message reaches CloudWatch; the stack trace is swallowed
            logger.log(ex.getMessage());
        }

        statusReport.setExecutiongTime(System.currentTimeMillis() - startTime);
        return statusReport;
    }
}
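Since the very first pass through the while loop is where it dies, one thing worth ruling out is a header row: if the CSV's first line holds column names, helper.parseIt receives the literal header text (for example the word "date") instead of a value. Assuming the CSVReader above is opencsv, its four-argument constructor can skip that line; a minimal sketch:

// Skip the first line of the file in case it is a header row
// (arguments: reader, separator, quote char, number of lines to skip)
CSVReader reader = new CSVReader(br, ',', '"', 1);

For the write path itself, the Lists.partition(itemList, 25) chunking matches BatchWriteItem's limit of 25 items per request, and the do/while re-submits whatever getUnprocessedItems() reports until the batch drains.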

In the CloudWatch console, the output looks like this: (screenshot of the CloudWatch log output; image not available)

My CSV file data looks something like this: (screenshot of the CSV contents; image not available)

Can anyone tell me where I am going wrong and how to resolve this issue?

0 Answers:

No answers