Posting and updating data with a FileField using ng-file-upload and Django REST framework

Time: 2018-08-13 10:49:11

Tags: angularjs django-models django-rest-framework

  1. Django REST framework is the backend and AngularJS is the frontend.
  2. Backend:

    from django.db import models
    from rest_framework import serializers

    class ReleaseBugs(models.Model):
        bug_name = models.CharField(max_length=30)
        bug_image = models.ImageField(blank=True, null=True, upload_to='release_bugs/')

    class ReleaseBugsSerializer(serializers.ModelSerializer):
        class Meta:
            model = ReleaseBugs
            fields = '__all__'
    
  3. Frontend:

    // Update an existing record: PUT to the detail endpoint
    // (the viewset/router wiring these URLs assume is sketched after this list)
    Upload.upload({
        method: 'PUT',
        url: api_url + '/' + data.id + '/',
        data: data
    }).then();

    // Create a new record: POST
    Upload.upload({
        method: 'POST',
        url: api_url + '/' + data.id + '/',
        data: data
    }).then();
    
  4. The problem: the first image upload works, but afterwards the backend returns bug_image as a URL string. When I then want to modify other fields (e.g. bug_name) and submit again, the request is rejected, because bug_image is no longer a file. My workaround is to convert the URL back to a blob with ng-file-upload before re-posting:

    Upload.urlToBlob($scope.release_bug.bug_image).then(function(blob) {
        $scope.release_bug.bug_image = blob;
        // do post
    });
    

    However, this way I just upload the same image again under a different file name, which I think wastes time and storage.
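For reference, the detail URLs above (api_url + '/' + data.id + '/') suggest the model is exposed through a router-registered viewset, roughly like the sketch below; the 'release_bugs' route name and the ReleaseBugsViewSet import path are assumptions for illustration, not shown in the original post.

    # urls.py -- assumed wiring, for illustration only
    from rest_framework import routers
    from .views import ReleaseBugsViewSet  # hypothetical viewset exposing ReleaseBugs

    router = routers.DefaultRouter()
    router.register(r'release_bugs', ReleaseBugsViewSet)
    urlpatterns = router.urls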

I'm new to this. Is there a better way?

1 answer:

Answer 0 (score: 0)

I came up with a solution: do the update directly in the viewset's update method instead of going through the serializer.
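A minimal sketch of that approach, assuming a ModelViewSet for ReleaseBugs; the field handling and the file check below are assumptions for illustration, not the answer author's exact code.

    # views.py -- minimal sketch, assuming a ModelViewSet; not the exact original code
    from rest_framework import viewsets, status
    from rest_framework.response import Response

    from .models import ReleaseBugs                  # hypothetical module layout
    from .serializers import ReleaseBugsSerializer

    class ReleaseBugsViewSet(viewsets.ModelViewSet):
        queryset = ReleaseBugs.objects.all()
        serializer_class = ReleaseBugsSerializer

        def update(self, request, *args, **kwargs):
            instance = self.get_object()

            # Write plain fields straight to the instance instead of running the
            # serializer, so a bug_image that comes back as a URL string does not
            # fail ImageField validation.
            instance.bug_name = request.data.get('bug_name', instance.bug_name)

            # Only replace bug_image when an actual file object was uploaded.
            uploaded = request.data.get('bug_image')
            if uploaded is not None and hasattr(uploaded, 'read'):
                instance.bug_image = uploaded

            instance.save()
            return Response(self.get_serializer(instance).data, status=status.HTTP_200_OK)

With something like this, the frontend PUT from the question can keep sending bug_image as the URL string; it is simply left untouched unless an actual file was uploaded.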
