Template tags in JavaScript

Date: 2015-06-09 23:39:31

Tags: javascript django django-templates django-template-filters

My Django template tags are not producing valid JavaScript. My latest error is:

SyntaxError: expected expression, got '&'
var resourceTypes = ['Structural Model', 'X-Ray Diffraction']

How can I get this working? I need to get these Django variables into the JS so I can build a chart (I'm using Google Charts).

index.html

<script>
  function drawChart() {

    // Create the data table.
    var data = new google.visualization.DataTable();
    data.addColumn('string');
    data.addColumn('number');

    var resourceTypes = {{ "all"|resource_types }}

    {% for x in resourceTypes %}
        data.addRows([
          [x, {{ x|resourceType_count }}],
        ]);
    {% endfor %}

    // Set chart options
    var options = {'title':'Datasets by Type',
                   'width':400,
                   'height':300};

    // Instantiate and draw our chart, passing in some options.
    var chart = new google.visualization.PieChart(document.getElementById('chart_div'));
    chart.draw(data, options);
  }

</script>

templatetags.py

from django import template

register = template.Library()

@register.filter(name='resource_types')
def resource_types(data_type):
    # Note: the argument is ignored; the filter always returns every type.
    resourceTypes = [str(x.data_type) for x in ResourceType.objects.all()]
    return resourceTypes

@register.filter(name='resourceType_count')
def resourceType_count(data_type):
    count = Dataset.objects.filter(
        data_type=ResourceType.objects.get(data_type=data_type)).count()
    return count
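The `&` in the SyntaxError is the clue: the filter returns a Python list, Django renders its `repr()` (single quotes), and autoescaping turns each `'` into `&#39;`, which is not valid JavaScript. A minimal sketch of a fix, using a hypothetical helper name `to_js_array`: serialize the list to JSON, which is also a valid JavaScript array literal:

```python
import json

# Hypothetical helper: turn a Python list into a JSON array literal.
# JSON uses double quotes and no Python-specific syntax, so the output
# can be dropped straight into a <script> block.
def to_js_array(items):
    return json.dumps([str(x) for x in items])

print(to_js_array(['Structural Model', 'X-Ray Diffraction']))
# ["Structural Model", "X-Ray Diffraction"]
```

In the real filter you would wrap the result with `django.utils.safestring.mark_safe()` (or apply the `|safe` filter in the template) so autoescaping leaves the quotes alone.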

1 Answer:

Answer 0 (score: 0)

You might consider using an assignment tag:

from collections import Counter

from django import template

register = template.Library()

@register.assignment_tag(takes_context=True)
def get_resource_types(context):
    # Count how many times each data_type string occurs.
    values = ResourceType.objects.values_list('data_type', flat=True)
    # Return the counts dict itself (not wrapped in another dict), so that
    # {% get_resource_types as resource_types %} binds the counts directly
    # and resource_types.items iterates (data_type, count) pairs.
    return dict(Counter(map(str, values)))

This gives you the count of each data_type string in the list of values, e.g.:

{'data type 1': 3, 'data type 2': 10, 'data type 3': 47}
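For illustration, here is the same Counter pattern run on a hard-coded list, with no database involved:

```python
from collections import Counter

# Same counting logic as in get_resource_types, on plain values.
values = ['data type 1', 'data type 2', 'data type 1']
counts = dict(Counter(map(str, values)))
print(counts)  # {'data type 1': 2, 'data type 2': 1}
```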

You can then pass that into the .addRows() call:

{% get_resource_types as resource_types %}

data.addRows([
    {% for data_type, count in resource_types.items %}
    ['{{ data_type }}', {{ count }}],
    {% endfor %}
]);

This should let you do everything in a single database query, rather than one query per type. You could also do it with an aggregate count per type on the queryset; depending on how much data is involved, I can't say which would be faster.
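The aggregate-count alternative mentioned above would look roughly like this. This is a sketch only, assuming `Dataset.data_type` is a ForeignKey to `ResourceType`; the ORM lines are shown as comments because they need a configured Django project to run:

```python
# Sketch: one GROUP BY query instead of one query per type.
# In a template tag or view you would run (Django ORM):
#
#   from django.db.models import Count
#   rows = (Dataset.objects
#           .values('data_type__data_type')
#           .annotate(n=Count('id')))
#
def counts_from_rows(rows):
    """Collapse annotated rows into a {type_name: count} dict."""
    return {r['data_type__data_type']: r['n'] for r in rows}

print(counts_from_rows([
    {'data_type__data_type': 'Structural Model', 'n': 3},
    {'data_type__data_type': 'X-Ray Diffraction', 'n': 10},
]))
# {'Structural Model': 3, 'X-Ray Diffraction': 10}
```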