How do I integrate sitemaps into django-oscar?

Asked: 2019-07-05 20:54:31

Tags: python django robots.txt django-oscar django-sitemaps

I want to integrate a sitemap into my Oscar project, and I am using the django-sitemaps package. Following its documentation, I created an app called sitemaps-django, configured views.py and sitemaps-django/pages/sitemaps.py, and wired up the URLconf in the main urls.py file.

views.py

from urllib.parse import urljoin

from django.conf import settings
from django_sitemaps import Sitemap

from sitemaps_django.pages.models import Page  # wherever your Page model lives
from sitemaps_django.pages.sitemaps import PagesSitemap

def sitemap(request):
    sitemap = Sitemap(
        build_absolute_uri=request.build_absolute_uri,
    )

    # URLs can be added one-by-one. The only required argument
    # is the URL. All other arguments are keyword-only arguments.
    for p in Page.objects.active():
        url = p.get_absolute_url()
        sitemap.add(
            url,
            changefreq='weekly',
            priority=0.5,
            lastmod=p.modification_date,
            alternates={
                code: urljoin(domain, url)
                for code, domain in PAGE_DOMAINS[p.language].items()
            },
        )

    # Adding conventional Django sitemaps is supported. The
    # request argument is necessary because Django's sitemaps
    # depend on django.contrib.sites, resp. RequestSite.
    sitemap.add_django_sitemap(PagesSitemap, request=request)

    # You could get the serialized XML...
    # ... = sitemap.serialize([pretty_print=False])
    # ... or use the ``response`` helper to return a
    # ready-made ``HttpResponse``:
    return sitemap.response(
        # pretty_print is False by default
        pretty_print=settings.DEBUG,
    )
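The `alternates` argument in the view above is just a dict mapping language codes to absolute URLs for a page's translations. A minimal, self-contained sketch of how that dict is built (`PAGE_DOMAINS` is a hypothetical per-language domain map defined by your own project, not something django-sitemaps provides):

```python
from urllib.parse import urljoin

# Hypothetical setting mirroring PAGE_DOMAINS in the view above:
# for each page language, the domain serving each translation.
PAGE_DOMAINS = {
    "en": {"en": "https://example.com/", "de": "https://example.de/"},
}

def build_alternates(language, url):
    """Return {language code: absolute URL} for a page's alternate versions."""
    return {
        code: urljoin(domain, url)
        for code, domain in PAGE_DOMAINS[language].items()
    }

print(build_alternates("en", "/about/"))
# → {'en': 'https://example.com/about/', 'de': 'https://example.de/about/'}
```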

sitemaps-django / pages / sitemaps.py

from django.urls import reverse_lazy
from django.conf.urls import url

urlpatterns = [
    url(r'^robots\.txt$', robots_txt(
        timeout=86400,
        sitemaps=[
            '/sitemap.xml',
            reverse_lazy('articles-sitemap'),
            ...,
        ],
    )),
]

urls.py

from django_sitemaps import robots_txt
from sitemaps_django.views import sitemap

urlpatterns = [
    url(r'^sitemap\.xml$', sitemap),
    url(r'^robots\.txt$', robots_txt(timeout=86400)),
    ...
]

I get this error:

url(r'^robots\.txt$', robots_txt(
NameError: name 'robots_txt' is not defined
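For context, the traceback points at the `url(r'^robots\.txt$', robots_txt(...))` line in the file shown above as sitemaps-django/pages/sitemaps.py, where `robots_txt` is called but never imported; the import only exists in the main urls.py. A sketch of that URLconf fragment with the missing import added (assuming django-sitemaps' `robots_txt` helper and the old-style `django.conf.urls.url`):

```python
from django.conf.urls import url
from django.urls import reverse_lazy

# robots_txt must be imported in every module that calls it,
# not only in the main urls.py.
from django_sitemaps import robots_txt

urlpatterns = [
    url(r'^robots\.txt$', robots_txt(
        timeout=86400,
        sitemaps=[
            '/sitemap.xml',
            reverse_lazy('articles-sitemap'),
        ],
    )),
]
```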

0 Answers:

No answers yet