robots

parent 648f103f5d
commit c456fe4ce9

robots.txt (new file, 20 lines added)
@@ -0,0 +1,20 @@
#
# robots.txt
#
# This file is to prevent the crawling and indexing of certain parts
# of your site by web crawlers and spiders run by sites like Yahoo!
# and Google. By telling these "robots" where not to go on your site,
# you save bandwidth and server resources.
#
# This file will be ignored unless it is at the root of your host:
# Used:    http://example.com/robots.txt
# Ignored: http://example.com/site/robots.txt
#
# For more information about the robots.txt standard, see:
# http://www.robotstxt.org/robotstxt.html
#
# For syntax checking, see:
# http://www.frobee.com/robots-txt-check

User-agent: *
Crawl-delay: 10
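For context only (not part of the committed file): a minimal sketch of how a polite crawler could interpret these two directives, using Python's standard urllib.robotparser. The example.com URL is the placeholder host already used in the file's comments.

    # Illustrative sketch, not part of this commit.
    import urllib.robotparser

    ROBOTS_TXT = """\
    User-agent: *
    Crawl-delay: 10
    """

    parser = urllib.robotparser.RobotFileParser()
    parser.parse(ROBOTS_TXT.splitlines())

    # No Disallow rules are present, so every path is fetchable.
    print(parser.can_fetch("*", "http://example.com/any/page"))  # True

    # A well-behaved crawler waits at least this many seconds between requests.
    print(parser.crawl_delay("*"))  # 10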