Bug 11190 - sitemap.pl -- Generate a catalog sitemap
Summary: sitemap.pl -- Generate a catalog sitemap
Status: CLOSED FIXED
Alias: None
Product: Koha
Classification: Unclassified
Component: Command-line Utilities
Version: Main
Hardware: All
OS: All
Importance: P5 - low new feature
Assignee: Frédéric Demians
QA Contact: Jonathan Druart
URL:
Keywords: dependency
Depends on: 6594
Blocks: 16016 16031
Reported: 2013-11-03 12:35 UTC by Frédéric Demians
Modified: 2017-06-14 22:08 UTC
CC List: 11 users

See Also:
Change sponsored?: ---
Patch complexity: Small patch
Documentation contact:
Documentation submission:
Text to go in the release notes:
Version(s) released in:


Attachments
Proposed patch (8.74 KB, patch)
2013-11-03 16:12 UTC, Frédéric Demians
Details | Diff | Splinter Review
Proposed patch (9.29 KB, patch)
2013-11-04 06:30 UTC, Frédéric Demians
Details | Diff | Splinter Review
Proposed patch v2 (11.07 KB, patch)
2014-01-02 12:18 UTC, Frédéric Demians
Details | Diff | Splinter Review
Proposed patch v3 (11.71 KB, patch)
2014-01-02 21:51 UTC, Frédéric Demians
Details | Diff | Splinter Review
Proposed patch v4 (11.78 KB, patch)
2014-01-06 20:16 UTC, Frédéric Demians
Details | Diff | Splinter Review
Bug 11190 sitemap.pl -- Generate a Catalog sitemap (11.88 KB, patch)
2014-01-06 21:50 UTC, Magnus Enger
Details | Diff | Splinter Review
Fix AnyEvent::Processor error (824 bytes, patch)
2014-01-17 16:09 UTC, Frédéric Demians
Details | Diff | Splinter Review
Bug 11190: [SIGNED-OFF] Fix AnyEvent::Processor dependency error (915 bytes, patch)
2014-03-12 10:15 UTC, Magnus Enger
Details | Diff | Splinter Review
Bug 11190 sitemap.pl -- Generate a Catalog sitemap (12.16 KB, patch)
2014-12-02 11:40 UTC, Frédéric Demians
Details | Diff | Splinter Review
Bug 11190 sitemap.pl -- Generate a Catalog sitemap (12.12 KB, patch)
2014-12-02 12:12 UTC, Frédéric Demians
Details | Diff | Splinter Review
[Passed QA] Bug 11190 sitemap.pl -- Generate a Catalog sitemap (12.20 KB, patch)
2014-12-03 11:02 UTC, Martin Renvoize
Details | Diff | Splinter Review
Bug 11190 sitemap.pl -- Generate a Catalog sitemap (11.92 KB, patch)
2015-04-16 07:03 UTC, Frédéric Demians
Details | Diff | Splinter Review
[SIGNED-OFF] Bug 11190: sitemap.pl -- Generate a Catalog sitemap (12.06 KB, patch)
2015-05-31 19:36 UTC, Bernardo Gonzalez Kriegel
Details | Diff | Splinter Review
Bug 11190 sitemap.pl -- Generate a Catalog sitemap (17.82 KB, patch)
2015-08-20 17:32 UTC, Frédéric Demians
Details | Diff | Splinter Review
Bug 11190 sitemap.pl -- Generate a Catalog sitemap (17.82 KB, patch)
2015-08-20 17:34 UTC, Frédéric Demians
Details | Diff | Splinter Review
Bug 11190 sitemap.pl -- Generate a Catalog sitemap (17.93 KB, patch)
2015-08-26 14:04 UTC, Jonathan Druart
Details | Diff | Splinter Review
QA followup for tests on jenkins (1.70 KB, patch)
2015-08-31 13:26 UTC, Frédéric Demians
Details | Diff | Splinter Review

Description Frédéric Demians 2013-11-03 12:35:33 UTC
Process all biblio records from a Koha instance and generate Sitemap
files complying with the protocol described at http://sitemaps.org. The
goal of this script is to give search engines direct access to biblio
records. It avoids having search engines crawl the Koha OPAC, which
generates a lot of traffic and workload for a poor result.
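
For illustration, here is a minimal sketch (not the patch itself) of how one such file can be produced with XML::Writer; the OPAC base URL and the record data below are made up:

  #!/usr/bin/perl
  # Minimal sketch: write one sitemaps.org <urlset> file with XML::Writer,
  # one <url>/<loc>/<lastmod> entry per biblio record.
  use strict;
  use warnings;
  use IO::File;
  use XML::Writer;

  my $base    = 'http://opac.example.org';                        # hypothetical OPAC base URL
  my @biblios = ( [ 171, '2013-10-02' ], [ 172, '2013-10-15' ] ); # [biblionumber, timestamp]

  my $fh = IO::File->new( 'sitemap0001.xml', 'w' ) or die "sitemap0001.xml: $!";
  my $w  = XML::Writer->new( OUTPUT => $fh, DATA_MODE => 1, DATA_INDENT => 2 );
  $w->xmlDecl('UTF-8');
  $w->startTag( 'urlset', xmlns => 'http://www.sitemaps.org/schemas/sitemap/0.9' );
  for my $biblio (@biblios) {
      my ( $biblionumber, $timestamp ) = @$biblio;
      $w->startTag('url');
      $w->dataElement( loc     => "$base/bib/$biblionumber" );
      $w->dataElement( lastmod => $timestamp );
      $w->endTag('url');
  }
  $w->endTag('urlset');
  $w->end;
  $fh->close;
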
Comment 1 Frédéric Demians 2013-11-03 16:12:21 UTC Comment hidden (obsolete)
Comment 2 Frédéric Demians 2013-11-04 06:30:33 UTC Comment hidden (obsolete)
Comment 3 Magnus Enger 2014-01-01 21:09:59 UTC
1. Typo in the module name, I think Koha::Sitemaper should be Koha::Sitemapper.
2. XML::Writer is not added to C4::Installer::PerlDependencies.
3. The POD for the script talks about generating files called sitemapindex.xml and sitemapXXXX.xml, but as far as I can see, all output is written to STDOUT and has to be split into different files by hand?
4. XML documents output by the script are missing the <?xml ... ?> declarations, but they might be optional? 
5. No tests for the stuff in Koha::Sitemaper and Koha::Sitemaper::Writer? 

Suggestions:

6. It could perhaps be said explicitly that the --url parameter should not include a trailing slash?
7. Could the base URL be taken from the OPACBaseURL syspref instead of being a command line parameter? 
8. URLs in the sitemap file are of the form http://example.org/bib/171. Some sites might have disabled the Apache rewrite rules that make these URLs functional, so it might be better to have the style /cgi-bin/koha/opac-detail.pl?biblionumber=172 as the default and make the shorter form available through a command line switch? (See the sketch after this comment.)

I'm marking this "Failed QA" because of number 2 above. But this would make a very cool addition to Koha, so I hope the issues can be fixed! :-)
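
As a sketch of point 8, a command-line flag could toggle between the two URL styles. The flag name and helper below are hypothetical, not taken from the patch:

  # Hypothetical sketch for point 8: choose the URL style via a --short flag.
  use strict;
  use warnings;

  sub biblio_url {
      my ( $base, $biblionumber, $short ) = @_;
      return $short
          ? "$base/bib/$biblionumber"
          : "$base/cgi-bin/koha/opac-detail.pl?biblionumber=$biblionumber";
  }

  print biblio_url( 'http://example.org', 172, 0 ), "\n";   # default: always works
  print biblio_url( 'http://example.org', 172, 1 ), "\n";   # needs the OPAC rewrite rules
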
Comment 4 Frédéric Demians 2014-01-02 09:11:41 UTC
Thanks Magnus for your feedback. Point 3 is definitely abnormal.
Comment 5 Frédéric Demians 2014-01-02 12:18:56 UTC Comment hidden (obsolete)
Comment 6 Paul Poulain 2014-01-02 15:16:48 UTC
QA preliminary comment : I thought we wanted not to use Moose, and were favoring Class::Accessor
Comment 7 Frédéric Demians 2014-01-02 16:51:43 UTC
(In reply to Paul Poulain from comment #6)
> QA preliminary comment : I thought we wanted not to use Moose, and were
> favoring Class::Accessor

You're correct, but I think that applies rather to CGI scripts, which can't absorb Moose's start-up penalty (it will become possible with Plack). It's not a problem for a CLI script.
Comment 8 Magnus Enger 2014-01-02 19:45:15 UTC
Thanks for fixing a lot of issues with this script! I have another couple of suggestions:

In regards to issue 3, the problem was on my side: I was executing the script through koha-shell, like so:

 $ sudo koha-shell -c "perl misc/cronjobs/sitemap.pl -h" kohadev

This meant that the script was actually being run as the kohadev-koha user (or something like that), which did not have write permission to the dir I was executing the script in. This caused the XML to be written to STDOUT instead of to a file. Would it be hard to add an (optional) switch for specifying where the output goes?

If sitemap.pl is run without any arguments, it will create the sitemap files. I seem to remember we have a convention that scripts run without arguments should print a help message and exit, but it does not seem to be in the coding guidelines... Maybe add a --run parameter to actually run the script?
Comment 9 Frédéric Demians 2014-01-02 21:51:56 UTC Comment hidden (obsolete)
Comment 10 Magnus Enger 2014-01-06 19:23:50 UTC
(In reply to Frédéric Demians from comment #9)
> > If sitemap.pl is run without any arguments, it will create the sitemap
> > files. I seem to remember we have a convention that scripts run without
> > arguments should print a help message and exit, but it does not seem to be
> > in the coding guidelines... Maybe add a --run parameter to actually run the
> > script?
> 
> I'm not sure on this point. For me, a --run is required for scripts
> which alter the database, and when there is a risk of data loss. This isn't
> the case here.

OK. 

This is really close now, but I have one last problem: It seems that if I run the script without the -dir parameter, no files are created in the current directory, and the script does not complain, with or without -v. 

But if I run the script with -dir and the absolute path to the current directory as an argument, the sitemap files are created just fine. 

I am testing on a gitified package-install, so I have to use sudo to be able to read koha-conf.xml:

$ sudo pwd
/home/magnus/scripts/kohaclone

# No files are created in /home/magnus/scripts/kohaclone:
$ sudo KOHA_CONF=/etc/koha/sites/kohadev/koha-conf.xml perl misc/cronjobs/sitemap.pl -v 
Creation of Sitemap files in '.' directory
Number of biblio records processed: 172
Number of Sitemap files:            1

# Files *are* created in /home/magnus/scripts/kohaclone:
$ sudo KOHA_CONF=/etc/koha/sites/kohadev/koha-conf.xml perl misc/cronjobs/sitemap.pl -v -dir /home/magnus/scripts/kohaclone
Creation of Sitemap files in '/home/magnus/scripts/kohaclone' directory
Number of biblio records processed: 172
Number of Sitemap files:            1
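
The symptom suggests the relative default directory ('.') is not resolved to an absolute path before the output files are opened. A minimal sketch of that kind of remedy (an assumption, not necessarily how the follow-up patch fixes it):

  # Sketch only: resolve a possibly relative --dir value to an absolute path
  # before opening the sitemap files, so '.' keeps pointing at the directory
  # the script was launched from.
  use strict;
  use warnings;
  use File::Spec;

  my $dir = shift // '.';                 # e.g. the value of -dir, defaulting to '.'
  my $abs = File::Spec->rel2abs($dir);    # '.' becomes /home/magnus/scripts/kohaclone
  die "Directory '$abs' is not writable\n" unless -d $abs && -w $abs;

  my $file = File::Spec->catfile( $abs, 'sitemapindex.xml' );
  open my $fh, '>', $file or die "Cannot write $file: $!";
  print {$fh} "<!-- sitemap index content goes here -->\n";
  close $fh;
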
Comment 11 Frédéric Demians 2014-01-06 20:16:53 UTC Comment hidden (obsolete)
Comment 12 Magnus Enger 2014-01-06 21:50:14 UTC Comment hidden (obsolete)
Comment 13 Jonathan Druart 2014-01-17 15:04:45 UTC
(In reply to Frédéric Demians from comment #7)
> (In reply to Paul Poulain from comment #6)
> > QA preliminary comment : I thought we wanted not to use Moose, and were
> > favoring Class::Accessor
> 
> You're correct, but I think that applies rather to CGI scripts, which can't
> absorb Moose's start-up penalty (it will become possible with Plack). It's
> not a problem for a CLI script.

Galen, could you confirm it is not a blocker for a push to master?
Comment 14 Kyle M Hall 2014-01-17 15:59:44 UTC
When I tried to test, the script couldn't find AnyEvent::Processor. Should this library be added as a dependency?
Comment 15 Frédéric Demians 2014-01-17 16:09:16 UTC Comment hidden (obsolete)
Comment 16 Magnus Enger 2014-03-12 10:15:09 UTC Comment hidden (obsolete)
Comment 17 Magnus Enger 2014-03-12 10:17:45 UTC
Just to avoid any confusion: I have now signed off both patches on this bug.
Comment 18 Alex Sassmannshausen 2014-03-12 10:36:55 UTC
Hello,

I was studying this patch at the same time as Magnus: I ran the QA scripts and the current patches fail unfortunately.

(Looks easy to fix though: documentation and licensing?)

Hope this helps,

Alex
Comment 19 Katrin Fischer 2014-12-02 10:27:46 UTC
Hi there, trying to see what the situation here is - are the QA tools still failing?
Comment 20 Katrin Fischer 2014-12-02 10:28:24 UTC
... to explain, can't test right now as I am away from my dev environment. Maybe someone can clarify before I can.
Comment 21 Frédéric Demians 2014-12-02 11:17:04 UTC
(In reply to Katrin Fischer from comment #20)
> ... to explain, can't test right now as I am away from my dev environment.
> Maybe someone can clarify before I can.

I can see QA errors. I will fix them and resubmit the patch with the status "Signed-off".
Comment 22 Frédéric Demians 2014-12-02 11:40:35 UTC Comment hidden (obsolete)
Comment 23 Frédéric Demians 2014-12-02 12:12:15 UTC Comment hidden (obsolete)
Comment 24 Martin Renvoize 2014-12-03 11:02:53 UTC Comment hidden (obsolete)
Comment 25 Martin Renvoize 2014-12-03 11:04:16 UTC
This now passes the QA scripts for me, and it appears to be generally OK code for a worthwhile feature. I'll leave the Moose debate to the release manager, as I'm a little confused on this front.
Comment 26 Paul Poulain 2014-12-03 11:26:32 UTC
A small question related to the sitemap size limits.
There are two limits: 50,000 URLs and 10MB maximum (uncompressed).
Has anyone tested with 50,000+ URLs and a "realistic" site URL?

I fear that 50k URLs would result in a file of more than 10MB, as our URLs can be pretty long. For example, http://koha.mediathequeouestprovence.fr/cgi-bin/koha/opac-detail.pl?biblionumber=135438 is only the URL; add headers and timestamps, wouldn't that make a >10MB file?
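
A rough back-of-envelope check, assuming each entry is a long opac-detail.pl URL like the one above wrapped in <url>/<loc>/<lastmod> markup:

  # Rough estimate only: size of 50,000 long-form entries, uncompressed.
  use strict;
  use warnings;

  my $url   = 'http://koha.mediathequeouestprovence.fr/cgi-bin/koha/opac-detail.pl?biblionumber=135438';
  my $entry = "  <url>\n    <loc>$url</loc>\n    <lastmod>2014-12-03</lastmod>\n  </url>\n";
  printf "%d bytes/entry, ~%.1f MB for 50,000 URLs\n",
      length($entry), 50_000 * length($entry) / ( 1024 * 1024 );
  # prints roughly: 155 bytes/entry, ~7.4 MB for 50,000 URLs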


About the Moose thing: I'm really annoyed. On one hand, this script is very useful and worth being pushed, no doubt. But on the other hand, it adds the Moose requirement (although not a mandatory one), which we removed some months ago.
My opinion would be: "OK, let's push this patch, but add a FIXME and a coding guideline to prevent any script like this from being pushed/submitted".

(and the "CLI script" argument seems invalid to me: otherwise, we could add many dependencies, and that would not be good)
Comment 27 Paul Poulain 2014-12-03 11:36:46 UTC
Suggestion for a future improvement: being able to add a WHERE clause to select which biblios get included in the generated sitemap.
This WHERE clause could rely on biblio/biblioitems/items columns.
For example, a library could avoid publishing records belonging to libraryX, exclude serials, or only include items published before YYYY.
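
A hypothetical sketch of such an option (the --where name and the SQL are illustrative only; nothing here exists in the patch):

  # Hypothetical --where option appended to the biblio selection query.
  use strict;
  use warnings;
  use Getopt::Long;

  my $where = '';
  GetOptions( 'where=s' => \$where );   # e.g. --where "copyrightdate < 2000"

  my $sql = 'SELECT biblionumber, timestamp FROM biblio';
  $sql .= " WHERE $where" if $where;    # trusted CLI input from the library administrator

  print "$sql\n";
  # hypothetical usage:
  #   --where "biblionumber NOT IN (SELECT biblionumber FROM items WHERE homebranch = 'LIBRARYX')"
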
Comment 28 Paul Poulain 2014-12-03 11:41:24 UTC
Additional (final?) comment: there is no changefreq definition. I can't find the default value if you don't specify it, but I think that a monthly or even yearly changefreq is better for a search engine. (And be careful: for a large catalogue, if we set too frequent a period, the server will be brought down by multiple requests. We had to disallow search engines via robots.txt for many libraries!)
Comment 29 Frédéric Demians 2014-12-03 11:59:32 UTC
(In reply to Martin Renvoize from comment #25)
> This now passes the QA scripts for me, and appears to be generally ok code
> for a worthwhile feature. I'll leave the Moose debate to release manager as
> I'm a little confused on this front.

When you shut the front door, Moose is the kind of thing that comes back in by the back door... Moo could be used instead. I don't see the advantage, nor do I see the problem with Moose, which is such a common dependency in the Perl ecosystem and is properly packaged in Debian (and others) now.
Comment 30 Frédéric Demians 2014-12-03 12:06:50 UTC
(In reply to Paul Poulain from comment #26)
> a small question related to the sitemap size limit.
> There are 2 limits : 50000 urls + 10MB maximum (unzipped).
> Has anyone tested with 50,000+ URLs and a "realistic" site URL?
>
> I fear that 50k URLs would result in a file of more than 10MB, as our URLs
> can be pretty long. For example,
> http://koha.mediathequeouestprovence.fr/cgi-bin/koha/opac-detail.pl?biblionumber=135438
> is only the URL; add headers and timestamps, wouldn't that make a >10MB file?

I've just tested, for this Koha site's URL, a 50,000-URL file:

- short version = 5.6M
- long version = 7.3M

> (and the "CLI script" argument seems an invalid one to me: otherwise, we
> could add many depenencies, and that would not be good)

It's not like Moose was an exotic, unmaintained module.
Comment 31 Frédéric Demians 2014-12-03 12:14:04 UTC
(In reply to Paul Poulain from comment #28)
> Additional (final ?) comment: there is no changefreq definition. I can't
> find the default value if you don't specify it, but I think that a monthly
> or even yearly changefreq is better for a search engine 

Web crawlers download the sitemap files at an unknown frequency, and then compare the new URLs with the previous ones, based on each URL's timestamp. Even if the sitemap is updated daily on the Koha server, the crawlers won't download it daily.

> (and be careful, for
> large catalogue, if we put a too frequent period, the server will be crashed
> by multiple requests. We had to disallow search engines using robots.txt on
> many libraries !

A sitemap helps crawlers not get lost in a Koha catalog. Whether it slows down the requests sent by crawlers, we can't really know. Empirically, I would say yes, but there is no way to be sure. Crawlers seem less tempted to follow multiple links at a rapid pace.
Comment 32 Paul Poulain 2014-12-03 13:03:47 UTC
(In reply to Frédéric Demians from comment #31)
> (In reply to Paul Poulain from comment #28)
> > Additional (final ?) comment: there is no changefreq definition. I can't
> > find the default value if you don't specify it, but I think that a monthly
> > or even yearly changefreq is better for a search engine 
> 
> The web crawlers download the sitemap files at an unknown frequency, and
> then compare the recent URL with the previous one, based on url's timestamp.
> Even if the sitemap is updated on the Koha server daily, the crawlers won't
> download it daily.
I think we're not talking about the same thing. http://www.sitemaps.org/protocol.html says:
===============================
 <changefreq> (optional)

How frequently the page is likely to change. This value provides general information to search engines and may not correlate exactly to how often they crawl the page. Valid values are:

    always
    hourly
    daily
    weekly
    monthly
    yearly
    never

The value "always" should be used to describe documents that change each time they are accessed. The value "never" should be used to describe archived URLs.

Please note that the value of this tag is considered a hint and not a command. Even though search engine crawlers may consider this information when making decisions, they may crawl pages marked "hourly" less frequently than that, and they may crawl pages marked "yearly" more frequently than that. Crawlers may periodically crawl pages marked "never" so that they can handle unexpected changes to those pages.
=================================

It can be defined for each page. My comment was about each opac-detail.pl page entry: we should tell the search engine that the page does not change often, so once it's indexed there's no need to come back frequently.

HTH
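
If the generator ever emits this hint, the corresponding XML::Writer call for a single entry could look like the sketch below; the 'monthly' value is only an example:

  # Sketch: adding the optional <changefreq> hint to one <url> entry.
  use strict;
  use warnings;
  use XML::Writer;

  my $w = XML::Writer->new( DATA_MODE => 1, DATA_INDENT => 2 );   # writes to STDOUT by default
  $w->startTag('url');
  $w->dataElement( loc        => 'http://example.org/cgi-bin/koha/opac-detail.pl?biblionumber=135438' );
  $w->dataElement( lastmod    => '2014-12-03' );
  $w->dataElement( changefreq => 'monthly' );   # a hint only; crawlers may ignore it
  $w->endTag('url');
  $w->end;
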
Comment 33 Frédéric Demians 2014-12-03 13:33:42 UTC
(In reply to Paul Poulain from comment #32)

> I think we're not talking of the same thing.
> http://www.sitemaps.org/protocol.html says 
> ===============================
>  <changefreq> 	optional 	

Yes. I was referring to the script running frequency. The generated sitemap doesn't contain changefreq at all.
Comment 34 Katrin Fischer 2014-12-06 08:59:13 UTC
I think we agreed after some benchmarking not to use Moose, while Moo seemed to be OK. A problem is that neither is currently a dependency in Koha, so they will have to be dealt with for the packaging. 
As this is an architecture patch, I think it might be worth getting Tomás' opinion and/or adding it to the next dev meeting's agenda.
Comment 35 Frédéric Demians 2015-04-16 07:03:23 UTC Comment hidden (obsolete)
Comment 36 Frédéric Demians 2015-04-16 07:12:58 UTC
A few personal notes about Moose vs Moo (I just saw the IRC dev meeting logs):

There is an overhead to using Moose for small, short-lived programs. But for long-running cronjob/maintenance scripts, there isn't any problem using Moose. For Koha WUI scripts, Moose start-up time is a problem as long as Koha operates in CGI mode. As soon as there is persistence (Plack), it isn't a problem anymore.

This issue of a runtime penalty due to using a large and complex library, doing a lot of work behind the scenes, is no different between Moose and DBIx::Class. I'm not saying "DBIx::Class is bad, Moose is bad too => so let's go with Moose!"; I'd rather say both libraries share a lot of good things.

I think that Moose has various advantages over other OO frameworks. It's complete, very well documented, widely used, a de facto standard, and carefully packaged for all major Linux distros. All the books about the so-called "modern Perl movement" deal with using Moose for OO. There is Perl's famous motto: there's more than one way to do it. Applied to OO, this can lead to chaos: there are so many ways to do OO in Perl that code becomes unreadable and unreliable. Moose forces you to adopt a proper and consistent OO approach. Moo tries to solve a non-issue (library weight, startup time) in modern execution environments, at a high price: too many important features are lost.
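
For the simple attribute usage a script like this needs, the difference between the two frameworks is mostly syntactic. A minimal comparison sketch (not the actual Koha::Sitemapper code; it assumes both modules are installed):

  # Minimal comparison of the same attribute declared with Moose and with Moo.
  package Sitemapper::WithMoose;
  use Moose;
  has url => ( is => 'rw', isa => 'Str', required => 1 );   # named string type constraint

  package Sitemapper::WithMoo;
  use Moo;
  # Moo has no built-in named type constraints: isa takes a code ref
  # (or a Type::Tiny type) instead of the string 'Str'.
  has url => ( is => 'rw', isa => sub { die "not a plain string\n" if ref $_[0] }, required => 1 );

  package main;
  use strict;
  use warnings;
  print Sitemapper::WithMoo->new( url => 'http://example.org' )->url, "\n";
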
Comment 37 Bernardo Gonzalez Kriegel 2015-05-31 19:36:11 UTC Comment hidden (obsolete)
Comment 38 Jonathan Druart 2015-06-04 11:15:43 UTC
Frédéric,
Could you please provide tests and add the new deps to C4/Installer/PerlDependencies.pm?
Comment 39 Frédéric Demians 2015-06-04 16:08:31 UTC
(In reply to Jonathan Druart from comment #38)
> Frédéric,
> Could you please provide tests 

Do you mean more tests than the minimalist t/Sitemapper.t?

> and add the new deps to
> C4/Installer/PerlDependencies.pm?

Which one other than Moo & XML::Writer?
Comment 40 Jonathan Druart 2015-06-04 16:32:54 UTC
(In reply to Frédéric Demians from comment #39)
> (In reply to Jonathan Druart from comment #38)
> > Frédéric,
> > Could you please provide tests 
> 
> Do you mean more tests than the minimalist t/Sitemapper.t?

Yes, if possible.

> > and add the new deps to
> > C4/Installer/PerlDependencies.pm?
> 
> Which one other than Moo & XML::Writer?

Oops, sorry I missed it.
Comment 41 Frédéric Demians 2015-08-20 17:32:56 UTC Comment hidden (obsolete)
Comment 42 Frédéric Demians 2015-08-20 17:34:28 UTC Comment hidden (obsolete)
Comment 43 Frédéric Demians 2015-08-20 17:37:12 UTC
I have set this patch's status back to 'Signed off'. The unit tests should suit QA's needs.
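
For reference, a sketch of the general shape such a test can take. The commented-out constructor call is an assumption about the module's interface, not a copy of the real t/Sitemapper.t, and a stand-in writes the file so the sketch stays self-contained:

  # Sketch of a test: run the generator into a temp dir, then check the files.
  use strict;
  use warnings;
  use Test::More tests => 3;
  use File::Temp qw( tempdir );

  my $dir = tempdir( CLEANUP => 1 );

  # Hypothetical call; a real test must also mock the biblio data source:
  # Koha::Sitemapper->new( url => 'http://example.org', dir => $dir )->run;

  # Stand-in for the generator so this sketch stays self-contained:
  open my $out, '>', "$dir/sitemapindex.xml" or die $!;
  print {$out} qq{<?xml version="1.0" encoding="UTF-8"?>\n<sitemapindex/>\n};
  close $out;

  ok( -e "$dir/sitemapindex.xml", 'sitemap index file created' );
  open my $in, '<', "$dir/sitemapindex.xml" or die $!;
  my $xml = do { local $/; <$in> };
  like( $xml, qr/^<\?xml/,       'file starts with an XML declaration' );
  like( $xml, qr/<sitemapindex/, 'file contains a sitemapindex element' );
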
Comment 44 Tomás Cohen Arazi 2015-08-25 01:30:34 UTC
(In reply to Frédéric Demians from comment #43)
> I have set back this patch status to 'Signed off'. The UT should suit QA
> needs.

I consider this passed QA, but as Jonathan asked about tests for the features, and the tests have now been provided, I think it is Jonathan who should decide.
Comment 45 Jonathan Druart 2015-08-26 14:04:24 UTC
Created attachment 41989 [details] [review]
Bug 11190 sitemap.pl -- Generate a Catalog sitemap

Add a script, sitemap.pl, to process all biblio records from a Koha
instance and generate Sitemap files complying with the protocol
described at http://sitemaps.org. The goal of this script is to give
search engines direct access to biblio records. It avoids having search
engines crawl the Koha OPAC, which generates a lot of traffic and
workload for a poor result.

Thanks Magnus for testing, and helping to improve the script design.

[2015.04.16] Switch from Moose to Moo.

[2015.08.20] Add more complete unit tests.

Signed-off-by: Magnus Enger <magnus@enger.priv.no>
All options to the script work as expected and the output looks
good. Nice enhancement!

Signed-off-by: Frederic Demians <f.demians@tamil.fr>

I signed off my own patch after fixing various QA errors.

Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>

Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Amended patch: replace tabs with spaces.
Comment 46 Tomás Cohen Arazi 2015-08-26 14:13:01 UTC
Feature pushed to master.

Awesome work Frederic!
Comment 47 Frédéric Demians 2015-08-31 13:26:41 UTC
Created attachment 42119 [details] [review]
QA followup for tests on jenkins
Comment 48 Frédéric Demians 2015-11-30 11:14:10 UTC
This patch has been pushed to 3.20.x, will be in 3.20.6.
Comment 49 Katrin Fischer 2015-11-30 13:24:44 UTC
I am not sure about this one in 3.20 - it's listed as one of the 'new features' for 3.22 and usually new features should not be backported.
Comment 50 Paul Poulain 2015-11-30 13:31:24 UTC
I think it's a/ a good idea and b/ safe to backport this new feature.
Why is it safe? Because it's not interlaced with the rest of Koha, so there's no risk of side effects.

I'm adding that the RMaint is also the sitemap writer, so he'll endure the god-fire in case of the death of any kitten anywhere in the world
(good luck Fred ;-) )
Comment 51 Tomás Cohen Arazi 2015-11-30 13:37:36 UTC
(In reply to Paul Poulain from comment #50)
> I think it's a/a good idea and b/safe to backport this new feature.
> Why is it safe ? because it's not interlaced with the rest of Koha. So no
> risk of side-effect.
> 
> I'm adding that the RMaint is also the sitemap writer, so he'll endorse
> god-fire in case of death of any kitten anywhere in the world
> (good luck Fred ;-) )

The only problem is the dependencies, which need to be updated for it to work, and which will force users to do a dist-upgrade in order to jump to the minor update.

It is uncomfortable, but has been done before, and it is Frederic's call as RMaint.
Comment 52 Frédéric Demians 2015-11-30 13:58:41 UTC
> The only problem are the dependencies, that should be updated for it to
> work, and will force the users to do a dist-upgrade in order to jump to the
> minor update.

The two new dependencies (XML::Writer & Moo) are not mandatory, so installing them isn't necessary until the new script is used.
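
In practice, "not mandatory" means the modules only need to be present when the sitemap feature is actually used. A sketch of a runtime guard (an illustration, not necessarily how sitemap.pl handles it):

  # Sketch: check the optional modules only when the feature is invoked.
  use strict;
  use warnings;

  my @optional = qw( XML::Writer Moo );
  my @missing  = grep { !eval "require $_; 1" } @optional;
  die "sitemap generation needs the optional module(s): @missing\n" if @missing;
  print "All optional dependencies are present, proceeding.\n";
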
Comment 53 Katrin Fischer 2015-11-30 19:45:52 UTC
If I understand Tomás correctly, it's still a problem for the packages, even though the dependencies are non-mandatory.
Comment 54 Julian Maurice 2015-12-04 11:04:08 UTC
Changed status to Pushed to Stable
Comment 55 Frédéric Demians 2015-12-04 12:51:40 UTC
(In reply to Katrin Fischer from comment #53)
> If I understand Tomas correctly it's still a problem for the packages, even
> tho the dependencies are non-mandatory.

In order to use the script: yes, otherwise: no.
Comment 56 Galen Charlton 2015-12-05 05:41:05 UTC
I have decided to exclude the patches for this bug from the package for 3.20.6, as I will not build packages if 'make test' fails... which it now does during package construction in 3.20.x.

This is because the new dependencies are required for Koha::Sitemap to compile (as is tested by t/00-load.t).  I will not add new dependencies because doing so would break apt-get upgrade for the oldstable stream.

It was contrary to custom to backport a new feature to a maintenance branch to begin with.  While, of course, there is room for some discretion on the part of the RMaint, for the new package maintainer -- i.e. me, adding new dependencies to the package for a maintenance release is a no-go for anything short of a major security issue that has no other plausible solution.

The sitemaps feature is not, of course, a security issue.  In the case of 3.14.10, which I think was the last time a dist-upgrade was forced for a maintenance release, the patches for bug 8044 at least had the promise of expanding the scope of I18N (although I note that that promise has yet to be fulfilled).

I realize that this puts folks in an awkward position.  I would prefer that 11190 be reverted from 3.20.x as a matter of policy; alternative approaches to deliver the functionality to users who are not yet ready to upgrade to 3.22.x include establishing a contrib package for the sitemap generator; and if needs must, I can revisit my decision for 3.20.7.
Comment 57 Frédéric Demians 2015-12-05 07:32:57 UTC
This patch has been reverted from 3.20.x.