When configuring a bibliographic Z39.50 target, it's possible to list XSL scripts that change the record on import. While the option is also offered when adding authority Z39.50 targets, it doesn't work / hasn't been implemented. It would be great if this feature were available for both types of targets.
*** Bug 20431 has been marked as a duplicate of this bug. ***
Created attachment 114604 [details] [review]
Bug 19220: Allow XSLT processing for Z39.50 authority targets

Test plan:
1) Apply the patch
2) Edit an authority Z3950/SRU source in Home > Administration > Z39.50/SRU servers
3) Add the path to an XSLT file in the "XSLT File(s) for transforming results" input, and save
4) Remove the content of the import_records table to avoid cache issues
5) Search for an authority with the "New from Z39.50/SRU" button in authorities home
6) Check that the XSLT transformation has been applied, both in results list and in the import window

Here is an example XSLT which removes the 801 field from authorities:

<?xml version="1.0" encoding="UTF-8"?>
<xsl:stylesheet xmlns:marc="http://www.loc.gov/MARC21/slim" version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:output method="xml" version="1.0" encoding="UTF-8" indent="yes"/>
  <xsl:strip-space elements="*"/>
  <xsl:template match="@* | node()">
    <xsl:copy>
      <xsl:apply-templates select="@* | node()"/>
    </xsl:copy>
  </xsl:template>
  <xsl:template match="marc:datafield[@tag='801']"/>
</xsl:stylesheet>
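A note on step 4: on a disposable test database you can empty the reservoir cache directly (a minimal sketch, assuming a package install where koha-mysql is available and nothing in the import tables needs keeping; <instancename> is a placeholder):

echo 'DELETE FROM import_auths; DELETE FROM import_biblios; DELETE FROM import_records;' | koha-mysql <instancename>

The child tables are deleted first in case the foreign keys don't cascade. On a real installation, the cleanup_database.pl cronjob discussed later in this thread is the safer route.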
I didn't add any new unit tests, since t/db_dependent/Breeding.t still passes with this patch, and _do_xslt_proc is already tested in it.
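Since the file lives under t/db_dependent, the check needs a configured test database (KOHA_CONF pointing at a disposable instance); the run itself is just:

prove t/db_dependent/Breeding.t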
Hi Matts,

I am sorry, but I couldn't get this to work. :(

1) Applied the patch
2) Restarted all
3) Added a file at intranet-tmpl/prog/en/xslt/Del040.xsl using your template. The only change I made was 801 to 040.
4) Updated the configuration of the Z39.50 LOC target for Names, adding "Del040.xsl" to the last config option.
5) Did a search for "Müller". Checked the MARC preview - the 040 is shown.
6) Imported the record: the 040 is still there :(

Can you spot the mistake, or verify it still works for you?
Hi,

I've tested this again on master today, and it still works as expected as far as I'm concerned. Two things come to mind:

- You don't mention emptying the import_records table. Did you do it? Otherwise, if you select a previously searched record, the cached version will be used and the XSLT transformation will not happen.
- Did you use the complete pathname when editing the Z39.50 server? (i.e. /home/koha/src/koha-tmpl/intranet-tmpl/prog/en/xslt/Del040.xsl and not koha-tmpl/intranet-tmpl/prog/en/xslt/Del040.xsl or just Del040.xsl)
(In reply to Matthias Meusburger from comment #5)
> Hi,
>
> I've tested this again on master today, and it still works as expected as
> far as I'm concerned.
>
> Two things come to mind:
>
> - You don't mention emptying the import_records table. Did you do it?
> Otherwise, if you select a previously searched record, the cached version
> will be used and the XSLT transformation will not happen.

I did a fairly random search; I don't think it was in the import table, or at least it would be quite unlikely.

> - Did you use the complete pathname when editing the Z39.50 server? (i.e.
> /home/koha/src/koha-tmpl/intranet-tmpl/prog/en/xslt/Del040.xsl and not
> koha-tmpl/intranet-tmpl/prog/en/xslt/Del040.xsl or Del040.xsl)

I used the file name only, as I had put my sample one in the same directory as the existing ones listed in the documentation for bibliographic Z39.50. Is this not expected to work?

Changing status so another person can give this a shot.
> I did a fairly random search; I don't think it was in the import table,
> or at least it would be quite unlikely.

Well, old entries in import_records are only deleted by misc/cronjobs/cleanup_database.pl, as far as I know. So if you never run this script, you can have very old entries. Oh, and is "Müller" really such a random search, given that it's the most common family surname in Germany? :)

> I used the file name only, as I had put my sample one in the same directory as
> the existing ones listed in the documentation for bibliographic Z39.50.
> Is this not expected to work?

You're absolutely right: as stated in https://koha-community.org/manual/20.11/en/html/administration.html#add-a-z39-50-target , a simple filename will be searched for in /koha-tmpl/intranet-tmpl/prog/en/xslt/ , among other directories, by the _get_best_default_xslt_filename function. I just tested this myself, and it worked. So perhaps it was the import_records table that should have been emptied?

Another side note: you won't see the transformation in the preview window. This is debatable, but it is the way it already works for bibliographic records, and one can argue that the preview is meant to show what the record looks like in its source catalogue.

Anyway, thanks for testing :)
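For the curious, my understanding is that the filename lookup behaves roughly like this (a simplified Perl sketch, not the actual C4::XSLT code; $htdocs, $theme, $lang and $filename are free variables standing in for the staff-interface template root, active theme, current language and the configured filename):

  use File::Spec;

  # Try the most specific theme/language directory first, falling back
  # to the default English 'prog' theme directory.
  for my $dir (
      "$htdocs/$theme/$lang/xslt",
      "$htdocs/$theme/en/xslt",
      "$htdocs/prog/$lang/xslt",
      "$htdocs/prog/en/xslt",
  ) {
      my $candidate = File::Spec->catfile( $dir, $filename );
      return $candidate if -f $candidate;
  }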
Sorry, Matts, I can't get this to work.

- I've made sure to search for all different terms
- The field I want to delete is still there after I imported the record

We really need someone else to give this a try.
Dear Katrin,

Matthias is right. I just tested it and it does work.

What I have noticed is that for bibliographic records, the import script re-transforms the output XML EVERY time (even if the record is cached from a previous search). For authority records, it prefers to fetch the cached version, as is, with the last transformation applied (or not applied, if you didn't specify a custom XSLT).

So, you have to run the cleanup_database.pl tool like this (assuming that you have a package installation):

koha-foreach --chdir --enabled /usr/share/koha/bin/cronjobs/cleanup_database.pl --confirm --z3950

in order to clear the cached entries from Z39.50 searches. Otherwise you will always see the last result. I tested it, and if I did not run the cleanup tool, the XSL didn't do anything. After I ran it, I got the "XSL-updated" results every time. Even if you remove the custom XSL from the Z39.50 authority server setting, you have to rerun the script for the change to take effect; otherwise it will still display the "XSL-updated" results from your previous queries.

I hope that this helps you,
-Fk
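For anyone on a git/dev install rather than packages, presumably the equivalent invocation (run from the Koha source tree with KOHA_CONF set) is:

perl misc/cronjobs/cleanup_database.pl --confirm --z3950

The --z3950 switch purges only the import_records entries created by Z39.50/SRU searches, so staged import batches are left alone.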
Matts, reading the comment above, I feel like the caching will be an issue here - it causes the transformation to not work reliably, as I have struggled so much with it. And I feel this will be very confusing to libraries that want to use the feature, too. Could we make it work "always", even if you happen to have "seen" the record before? That would also be consistent with how it works for bibliographic records.

Just to make sure: my last test was done on a fresh sample database with lots of different names, so I believe there might be another cause too, even if everyone tells me otherwise :)
Katrin,

Just one more thought on that, since you mention that the failure was consistent in your case. Did you always check that the search was done *only* on the Z39.50 server on which you have configured the XSLT? Because every time the search window is reopened, all default search servers are pre-selected.
I have 2 authority targets configured, but made sure to only select results from the one that has the XSLT transformation configured.
Created attachment 121897 [details] [review]
Bug 19220 - Allow XSLT processing for Z39.50 authority

Matthias and Katrin,

I worked on it a bit, and I seem to have found where the issue might be. It is in the Breeding.pm module: for authorities, if an import_record_id already exists in the import_records table for that authority (via the import_auths table), it simply returns that import_record_id, ignoring the new MARC data returned from the new XSL transformation. If the auth record does not exist in the import_auths table, then AddAuthToBatch() is called, which correctly adds the auth into import_records, with the marc and marcxml. In the case of biblios, AddBiblioToBatch() is always run, even if the biblio is already in the cache, returning the new id and hence the new marc.

Please try the patch I have uploaded, where I update the import_records table for the specific import_record_id with the new MARC values (plain marc and marcxml), since they are both used later on in the MARC preview and import functionalities.
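The core of the idea, as a rough sketch (not the actual patch; raw DBI and the import_records column names are my shorthand here):

  # Overwrite the cached copy with the freshly transformed record
  # instead of returning it as-is.
  my $dbh = C4::Context->dbh;
  $dbh->do(
      q{UPDATE import_records SET marc = ?, marcxml = ? WHERE import_record_id = ?},
      undef,
      $marcrecord->as_usmarc,
      $marcrecord->as_xml_record($marcflavour),
      $import_record_id,
  );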
Created attachment 121906 [details] [review]
Bug 19220 - Allow XSLT processing for Z39.50 authority targets like for bibliographic targets

A minor correction to the previous patch in Breeding.pm, where it retrieves the marcflavour from the preferences: the "shift" operation is not required.
Created attachment 121907 [details] [review]
Bug 19220 - Allow XSLT processing for Z39.50 authority - version 1
Any news? Are the 2 new patches an alternative?
Dear Fridolin, good morning.

As far as I am concerned, I have tested it on branches 20.x and 21.x and it works as expected.
Needs a rebase on current master
I suppose _do_xslt_proc() can be called just after SetUTF8Flag()
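Something along these lines, I suppose (a positioning sketch only, abbreviated; the $server->{add_xslt} guard is my assumption, copied from how I read the bibliographic path in C4/Breeding.pm):

  SetUTF8Flag($marcrecord);
  $marcrecord = _do_xslt_proc( $marcrecord, $server, $xslh )
      if $server->{add_xslt};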
Arf, for UNIMARC there is a strange behavior; we had to patch it so that the MARC flavour is always set: https://git.biblibre.com/biblibre/kohac/commit/cdb6e4cd1d56ae2904f7f722f21ba0f424a82190
Created attachment 172627 [details] [review]
Bug 19220: Allow XSLT processing for Z39.50 authority targets

Test plan:
1) Apply the patch
2) Edit an authority Z3950/SRU source in Home > Administration > Z39.50/SRU servers
3) Add the path to an XSLT file in the "XSLT File(s) for transforming results" input, and save
   For instance: <path_to_src>/koha-tmpl/intranet-tmpl/prog/en/xslt/Bug19220.xsl
4) Search for an authority with the "New from Z39.50/SRU" button in authorities home
5) Check that the XSLT transformation has been applied, both in results list and in the import window
6) Edit the Z3950/SRU source to remove the path to the XSLT file
7) Search again for the same authority, and check that no transformation has been applied
8) prove t/db_dependent/Breeding_Auth.t

Here is an example XSLT which adds a 035$a field:

<?xml version="1.0" encoding="UTF-8"?>
<xsl:stylesheet xmlns:marc="http://www.loc.gov/MARC21/slim" version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:output method="xml" version="1.0" encoding="UTF-8" indent="yes"/>
  <xsl:template match="record|marc:record">
    <record>
      <xsl:apply-templates/>
      <datafield tag="035" ind1='' ind2=''>
        <subfield code="a">
          <xsl:text>XSLT added field</xsl:text>
        </subfield>
      </datafield>
    </record>
  </xsl:template>
  <xsl:template match="node()">
    <xsl:copy>
      <xsl:copy-of select="@*"/>
      <xsl:apply-templates/>
    </xsl:copy>
  </xsl:template>
</xsl:stylesheet>
Created attachment 172628 [details] [review]
Bug 19220: Allow XSLT processing for Z39.50 authority targets

(Same test plan and example XSLT as the previous attachment.)
Hi!

I've reworked this patch:
- Merged all the patches into one (thanks, Filippos Kolovos, for your contribution)
- Updated the test plan
- Wrote unit tests specifically to check that we don't run into the caching issues you mentioned earlier, Katrin.

It's ready to be tested again :)

To the QA team: the QA script tells me I've added 2 messy lines, even though I ran Tidy on the code I added. I don't know why, sorry about that.
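Conceptually, the caching check boils down to something like this schematic (not the real Breeding_Auth.t; search_z3950_authority() is a hypothetical stand-in for the search-and-stage round trip, returning a MARC::Record):

  # The first search populates import_records; the second hits the cache.
  my $first  = search_z3950_authority($query);
  my $second = search_z3950_authority($query);

  # Even the cached copy must carry the XSLT-added field.
  is( $second->field('035')->subfield('a'),
      'XSLT added field',
      'cached record was re-transformed rather than returned stale' );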
Created attachment 173438 [details] [review]
Bug 19220: Allow XSLT processing for Z39.50 authority targets

(Same test plan and example XSLT as attachment 172627.)

Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Works in testing for me.. :)