--- a/README.robots
+++ b/README.robots
@@ -5,14 +5,23 @@ look for the file /robots.txt. If this file is found and has lines that apply
 to them they will do as instructed.  A very basic robots.txt follows as an
 example:
 
+-------------------------------------------
 # go away
 User-agent: *
 Disallow: /
+-------------------------------------------
 
 This tells every search engine that cares (User-agent: *) to not index the site
 (Disallow everything past /).
 
+Another slightly more intelligent robots.txt example allows some bot indexing (good for your site's visibility in Google, etc.), but also stops your Koha from getting thrashed by keeping robots away from URLs that cause heavy search load:
+
+-------------------------------------------
+# do some indexing, but don't index search URLs
+User-agent: *
+Disallow: /cgi-bin/koha/opac-search.pl
+-------------------------------------------
+
 If you have installed Koha to /usr/local/koha3 then this file would be placed
 in the directory /usr/local/koha3/opac/htdocs/.  This should prevent search
 engines from browsing every biblio record, and every view of each record, on
-
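To see what the basic example above actually does, here is a minimal sketch using Python's standard-library urllib.robotparser. The hostname (opac.example.org) and the opac-detail.pl query are illustrative placeholders, not part of the patch:

-------------------------------------------
# Sketch: check the "Disallow: /" example with Python's stdlib parser.
from urllib.robotparser import RobotFileParser

rules = """\
# go away
User-agent: *
Disallow: /
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Every path is off limits to every well-behaved crawler.
print(rp.can_fetch("*", "http://opac.example.org/"))  # False
print(rp.can_fetch("Googlebot",
                   "http://opac.example.org/cgi-bin/koha/opac-detail.pl?biblionumber=1"))  # False
-------------------------------------------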
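The same kind of check works for the new, more selective example: search URLs are blocked (including any query string, since robots.txt rules are prefix matches on the path), while record detail pages stay crawlable. Again a hedged sketch; the hostname and biblionumber are made up:

-------------------------------------------
# Sketch: the selective example blocks opac-search.pl but nothing else.
from urllib.robotparser import RobotFileParser

rules = """\
# do some indexing, but don't index search URLs
User-agent: *
Disallow: /cgi-bin/koha/opac-search.pl
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

base = "http://opac.example.org"  # placeholder OPAC hostname
print(rp.can_fetch("*", base + "/cgi-bin/koha/opac-search.pl?q=history"))       # False
print(rp.can_fetch("*", base + "/cgi-bin/koha/opac-detail.pl?biblionumber=1"))  # True
-------------------------------------------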
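Finally, a quick way to confirm the file lands where the README says and is actually served by the OPAC. The /usr/local/koha3 prefix comes from the README itself; the hostname is a placeholder:

-------------------------------------------
# Sketch: install robots.txt into the OPAC DocumentRoot and fetch it back.
import shutil
import urllib.request

# Path from this README; adjust to your install prefix.
shutil.copy("robots.txt", "/usr/local/koha3/opac/htdocs/robots.txt")

# Placeholder hostname: use your OPAC's address.
with urllib.request.urlopen("http://opac.example.org/robots.txt") as resp:
    print(resp.status)           # expect 200
    print(resp.read().decode())  # should echo the rules installed above
-------------------------------------------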
