How to add robots.txt to a Django project

There are several common ways to set up a URL for a file like robots.txt. The approach I have tested and confirmed to work is to put the txt file in the templates directory and then add a pattern in urls.py:

url(r'^oauth/MP_verify_ZrapWLFI0Fq2bplZ.txt$', TemplateView.as_view(template_name='MP_verify_ZrapWLFI0Fq2bplZ.txt', content_type='text/plain')),
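On Django 2.0 and later, `url()` and `patterns()` are deprecated or removed in favor of `path()`/`re_path()`. A minimal sketch of the equivalent pattern, reusing the example template name from above and assuming it sits in a configured templates directory:

```python
# urls.py -- sketch for Django 2.0+, where path() replaces url()
from django.urls import path
from django.views.generic import TemplateView

urlpatterns = [
    path('oauth/MP_verify_ZrapWLFI0Fq2bplZ.txt',
         TemplateView.as_view(template_name='MP_verify_ZrapWLFI0Fq2bplZ.txt',
                              content_type='text/plain')),
]
```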

下面是一个参考的设置方法:

Three ways to add a robots.txt to your Django project[1]
Need to add a robots.txt[2] file to your Django project to tell Google and friends what and what not to index on your site?
Here are three ways to add a robots.txt file to Django.
1) The (almost) one-liner
In an article[3] on e-scribe.com, Paul Bissex suggests adding this rule to your urls.py file:
from django.http import HttpResponse

urlpatterns = patterns('',
    (r'^robots\.txt$', lambda r: HttpResponse("User-agent: *\nDisallow: /", mimetype="text/plain")),
)
The advantage of this solution is, it is a simple one-liner disallowing all bots, with no extra files to be created, and no clutter anywhere. It’s as simple as it gets.
The disadvantage, obviously, is the missing scalability. The instant you have more than one rule to add, this approach quickly balloons out of hand. Also, one could argue that urls.py is not the right place for content of any kind.
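One way to keep the one-liner approach from ballooning as rules accumulate is to build the response body from a data structure instead of a growing string literal. A hypothetical helper (the name `robots_txt` and its signature are my own, not part of Django):

```python
def robots_txt(rules):
    """Render a robots.txt body from (user_agent, disallowed_paths) pairs."""
    lines = []
    for agent, paths in rules:
        lines.append(f"User-agent: {agent}")
        lines.extend(f"Disallow: {path}" for path in paths)
        lines.append("")  # blank line between user-agent groups
    return "\n".join(lines).rstrip() + "\n"

body = robots_txt([("*", ["/admin/", "/private/"])])
# A view would then return HttpResponse(body, content_type="text/plain").
```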
2) Direct to template
This one is the most intuitive approach: Just drop a robots.txt file into your main templates directory and link to it via direct_to_template:
For Django version < 1.3[4]:
from django.views.generic.simple import direct_to_template

urlpatterns = patterns('',
    (r'^robots\.txt$', direct_to_template,
        {'template': 'robots.txt', 'mimetype': 'text/plain'}),
)
For Django version >= 1.3:
from django.views.generic import TemplateView

urlpatterns = patterns('',
    (r'^robots\.txt$', TemplateView.as_view(template_name='robots.txt', content_type='text/plain')),
)
Just remember to set the MIME type appropriately to text/plain, and off you go.
Advantage is its simplicity, and if you already have a robots.txt file you want to reuse, there’s no overhead for that.
Disadvantage: If your robots file changes somewhat frequently, you need to push changes to your web server every time. That can get tedious. Also, this approach does not save you from typos or the like.
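For reference, the robots.txt file you drop into the templates directory is just plain text. A made-up example that blocks all crawlers from a couple of private paths while leaving the rest of the site open:

```
User-agent: *
Disallow: /admin/
Disallow: /accounts/
```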
3) The django-robots app
Finally, there’s a full-blown django app available that you can install and drop into your INSTALLED_APPS: It is called django-robots[5].
For small projects, this would be overkill, but if you have a lot of rules, or if you need a site admin to change them without pushing changes to the web server, this is your app of choice.
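For completeness, the basic wiring for django-robots looks roughly like the following. This is a sketch; check the project's own documentation for authoritative, version-specific instructions. It assumes the package is installed via pip and that the sites framework, which django-robots builds on, is enabled:

```python
# settings.py (sketch)
INSTALLED_APPS = [
    # ...
    'django.contrib.sites',
    'robots',
]
SITE_ID = 1

# urls.py (sketch) -- rules are then managed through the Django admin
from django.urls import include, re_path

urlpatterns = [
    re_path(r'^robots\.txt', include('robots.urls')),
]
```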
Which one is right for me?
Depending on how complicated your rule set is, any one of these solutions may be the best fit for you. Just choose the one that you are the most comfortable with and that fits the way you are using robots.txt in your application.

木易的技术记录 » How to add robots.txt to a Django project
