Bug 19220 - Allow XSLT processing for Z39.50 authority targets like for bibliographic targets
Summary: Allow XSLT processing for Z39.50 authority targets like for bibliographic targets
Status: Failed QA
Alias: None
Product: Koha
Classification: Unclassified
Component: Z39.50 / SRU / OpenSearch Servers
Version: Main
Hardware: All
OS: All
Importance: P5 - low enhancement (8 votes)
Assignee: Matthias Meusburger
QA Contact: Testopia
URL:
Keywords: release-notes-needed
Duplicates: 20431
Depends on: 22532
Blocks:
Reported: 2017-08-31 08:53 UTC by Katrin Fischer
Modified: 2023-10-06 22:08 UTC
CC List: 5 users

See Also:
Change sponsored?: ---
Patch complexity: ---
Documentation contact:
Documentation submission:
Text to go in the release notes:
Version(s) released in:


Attachments
Bug 19220: Allow XSLT processing for Z39.50 authority targets (2.32 KB, patch)
2020-12-22 14:53 UTC, Matthias Meusburger
Bug 19220 - Allow XSLT processing for Z39.50 authority (2.58 KB, patch)
2021-06-13 16:35 UTC, Filippos Kolovos
Bug 19220- Allow XSLT processing for Z39.50 authority targets like for bibliographic targets (1.02 KB, patch)
2021-06-14 06:54 UTC, Filippos Kolovos
Bug 19220 Allow XSLT processing for Z39.50 authority version 1 (2.58 KB, patch)
2021-06-14 06:56 UTC, Filippos Kolovos

Description Katrin Fischer 2017-08-31 08:53:22 UTC
When configuring a bibliographic Z39.50 target, it's possible to list XSL scripts that change the record on import.
While the option is also offered when adding authority Z39.50 targets, it doesn't work there / hasn't been implemented.
It would be great if this feature was available for both types of targets.
Comment 1 Katrin Fischer 2019-03-17 21:10:36 UTC
*** Bug 20431 has been marked as a duplicate of this bug. ***
Comment 2 Matthias Meusburger 2020-12-22 14:53:36 UTC
Created attachment 114604 [details] [review]
Bug 19220: Allow XSLT processing for Z39.50 authority targets

Test plan:

 1) Apply the patch
 2) Edit an authority Z3950/SRU source in Home > Administration > Z39.50/SRU servers
 3) Add the path to an XSLT file in the "XSLT File(s) for transforming results" input, and save
 4) Remove the content of the import_records table to avoid cache issues (see the sketch after this test plan)
 5) Search for an authority with the "New from Z39.50/SRU" button in authorities home
 6) Check that the XSLT transformation has been applied, both in results list and in the import window
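
For step 4, one quick way to empty the cache from within the Koha environment (a minimal sketch; the record_type filter is an assumption about the current import_records schema, so drop the WHERE clause if it does not apply):

use C4::Context;

# Clear cached Z39.50/SRU search results so the next search re-runs the XSLT.
# Rows in import_auths referencing these records should go away via
# ON DELETE CASCADE; if your schema lacks that constraint, empty import_auths first.
C4::Context->dbh->do(q{DELETE FROM import_records WHERE record_type = 'auth'});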

Here is an example XSLT which removes the 801 field from authorities:

<?xml version="1.0" encoding="UTF-8"?>
<xsl:stylesheet xmlns:marc="http://www.loc.gov/MARC21/slim" version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform">

  <xsl:output method="xml" version="1.0" encoding="UTF-8" indent="yes"/>

  <xsl:strip-space elements="*"/>

  <xsl:template match="@* | node()">
    <xsl:copy>
      <xsl:apply-templates select="@* | node()"/>
    </xsl:copy>
  </xsl:template>

  <xsl:template match="marc:datafield[@tag='801']">
  </xsl:template>

</xsl:stylesheet>
Comment 3 Matthias Meusburger 2020-12-22 14:55:53 UTC
I didn't add any new unit tests, since t/db_dependent/Breeding.t still passes with this patch, and _do_xslt_proc is already tested in it.
Comment 4 Katrin Fischer 2021-04-17 12:59:20 UTC
Hi Matts,

I am sorry, but I couldn't get this to work. :(

1) Applied patch
2) Restart all
3) Added a file to intranet-tmpl/prog/en/xslt/Del040.xsl using your template.
  Only change I made was 801 to 040.
4) Updated configuration for Z39.50 LOC for Names adding "Del040.xsl" to the
  last config option.
5) Did a search for "Müller". Checked MARC preview - 040 is shown.
6) Imported the record: 040 is there :(

Can you spot the mistake or verify it still works for you?
Comment 5 Matthias Meusburger 2021-05-17 14:13:55 UTC
Hi,

I've tested this again on master today, and it still works as expected as far as I'm concerned.

Two things that come to mind:

 - You don't mention emptying the import_records table. Did you do it? Otherwise, if you select a previously searched record, the cached version will be used and the XSLT transformation will not happen.

 - Did you use the complete pathname when editing the Z39.50 server? (i.e.: /home/koha/src/koha-tmpl/intranet-tmpl/prog/en/xslt/Del040.xsl and not koha-tmpl/intranet-tmpl/prog/en/xslt/Del040.xsl or Del040.xsl)
Comment 6 Katrin Fischer 2021-05-17 21:45:10 UTC
(In reply to Matthias Meusburger from comment #5)
> Hi,
> 
> I've tested this again on master today, and it still works as expected as
> far as I'm concerned.
> 
> Two things that come to mind:
> 
>  - You don't mention emptying the import_records table. Did you do it?
> Otherwise, if you select a previously searched record, the cached version
> will be used and the XSLT transformation will not happen.

I did a fairly random search; I don't think it was in the import table, or at least it would be quite unlikely.

>  - Did you use the complete pathname when editing the Z39.50 server? (i.e.:
> /home/koha/src/koha-tmpl/intranet-tmpl/prog/en/xslt/Del040.xsl and not
> koha-tmpl/intranet-tmpl/prog/en/xslt/Del040.xsl or Del040.xsl)

I used the file name, as I had put my sample one in the same directory as the existing ones listed in the documentation for bibliographic Z39.50.
Is this not expected to work?

Changing status so another person could give this a shot.
Comment 7 Matthias Meusburger 2021-05-18 14:05:14 UTC
> 
> I did a fairly random search; I don't think it was in the import table, or
> at least it would be quite unlikely.

Well, old entries in import_records are only deleted by misc/cronjobs/cleanup_database.pl as far as I know. So if you don't run this script at all, you can have very old entries.

Oh, and is "Müller" such a random search, since it's the most common family surname in Germany? :)

> I used the file name, as I had put my sample one in the same directory as
> the existing ones listed in the documentation for bibliographic Z39.50.
> Is this not expected to work?

You're absolutely right: as stated in https://koha-community.org/manual/20.11/en/html/administration.html#add-a-z39-50-target , a simple filename will be searched for in /koha-tmpl/intranet-tmpl/prog/en/xslt/, among other directories, by the _get_best_default_xslt_filename function.
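
For reference, the lookup works roughly like this (a paraphrased sketch of what _get_best_default_xslt_filename does, not the verbatim Koha code):

sub _get_best_default_xslt_filename {
    my ( $htdocs, $theme, $lang, $base_xslfile ) = @_;

    # Try the most specific theme/language directories first and fall back
    # to the English default theme; the first existing file wins.
    my @candidates = (
        "$htdocs/$theme/$lang/xslt/$base_xslfile",
        "$htdocs/$theme/en/xslt/$base_xslfile",
        "$htdocs/prog/$lang/xslt/$base_xslfile",
        "$htdocs/prog/en/xslt/$base_xslfile",
    );
    my $xslfilename;
    for my $filename (@candidates) {
        $xslfilename = $filename;
        last if -f $filename;
    }
    return $xslfilename;
}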

I just tested this myself, and it worked. So perhaps it was the import_records table that should have been emptied?

Another sidenote: 

You won't see the transformation in the preview window. This is debatable, but it is the way it already works for bibliographic records, and one can argue that the preview is meant to show how the record looks in its source catalogue.

Anyway, thanks for testing :)
Comment 8 Katrin Fischer 2021-06-03 22:08:37 UTC
Sorry, Matts, I can't get this to work.

 - I've made sure to search for a variety of different terms
- The field I want to delete is still there after I imported the record

We really need someone else to give this a try.
Comment 9 Filippos Kolovos 2021-06-13 09:14:50 UTC
Dear Katrin,

Matthias is right. I just tested it and it does work. 

What I have noticed is that for bibliographic records, the import script re-transforms the output XML EVERY time (even if the record is cached from a previous search).

For authority records, it fetches the cached version as-is, with the last transformation made (or not made, if you didn't specify a custom XSLT).

So, you have to run the cleanup_database.pl tool like this (assuming that you have a package installation):

koha-foreach --chdir --enabled /usr/share/koha/bin/cronjobs/cleanup_database.pl --confirm --z3950

in order to clear the cached entries from Z39.50 searches. Otherwise you will always see the last result. I tested it, and if I did not run the cleanup tool, the XSL didn't do anything; after I ran it, I got the "XSL-updated" results every time.

Even if you remove the custom XSL from the Z39.50 authority server setting, you have to rerun the script for the change to take effect; otherwise it will still display the "XSL-updated" results from your previous queries.

I hope that this helps you,

-Fk
Comment 10 Katrin Fischer 2021-06-13 11:10:07 UTC
Matts, 

reading the comment above, I feel like the caching will be an issue here - it will cause the transformation to not work reliably, as I have struggled so much with it. And I feel like this will be very confusing to libraries that want to use the feature, too.

But could we make it work "always", even if you happen to have "seen" the record before? I get that this would also be consistent with how it works for bibliographic records.

Just to make sure: My last test was done on a fresh sample database with lots of different names, so I believe there might be another cause too, even if everyone tells me otherwise :)
Comment 11 Filippos Kolovos 2021-06-13 12:43:03 UTC
Katrin,

Just one more thought on that, since you mention that the failure was consistent in your case.

Did you always check that the search was done *only* on the Z39.50 server on which you have configured the XSLT? Because every time the search window is reopened, all default search servers are pre-selected.
Comment 12 Katrin Fischer 2021-06-13 12:51:39 UTC
I have 2 authority targets configured, but made sure to only select results from the one configured with the XSLT transformation.
Comment 13 Filippos Kolovos 2021-06-13 16:35:06 UTC
Created attachment 121897 [details] [review]
Bug 19220 - Allow XSLT processing for Z39.50 authority

Matthias and Katrin,

I worked on it a bit and I seem to have found where the issue might be.
It is in the Breeding.pm module: for authorities, if an import_record_id already exists in the import_auths table for that authority, it simply returns that import_record_id, ignoring the new MARC data returned from the new XSL transformation.
If the auth record does not exist in the import_auths table, then AddAuthToBatch() is called, which correctly adds the authority into import_records with the marc and marcxml.

In the case of biblios, it always runs AddBiblioToBatch(), even if the biblio is already in the cache, returning the new id and hence the new MARC.

Please try the patch I have uploaded, where I update the import_records table for the specific import_record_id with the new MARC values (plain marc and marcxml), since they are both used later on in the MARC preview and import functionalities.
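
The core of the change is an update along these lines (a minimal sketch, not the verbatim patch; $marcrecord, $marcflavour and $import_record_id stand for the variables in scope at that point, and as_usmarc/as_xml_record are the standard MARC::Record serializers):

use C4::Context;

# Refresh the cached copy so that the MARC preview and the import both see
# the newly transformed record instead of the stale cached one.
C4::Context->dbh->do(
    q{UPDATE import_records SET marc = ?, marcxml = ? WHERE import_record_id = ?},
    undef,
    $marcrecord->as_usmarc(),
    $marcrecord->as_xml_record($marcflavour),
    $import_record_id,
);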
Comment 14 Filippos Kolovos 2021-06-14 06:54:08 UTC
Created attachment 121906 [details] [review]
Bug 19220- Allow XSLT processing for Z39.50 authority targets like for bibliographic targets

A minor correction to the previous patch, in Breeding.pm, where it retrieves the marcflavour from preferences: the "shift" operation is not required.
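
Presumably the corrected retrieval is just the plain preference call (an assumption based on the comment; the actual change is in the attached diff):

# No shift is needed: the MARC flavour comes straight from the system preference.
my $marcflavour = C4::Context->preference('marcflavour');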
Comment 15 Filippos Kolovos 2021-06-14 06:56:54 UTC
Created attachment 121907 [details] [review]
Bug 19220 Allow XSLT processing for Z39.50 authority version 1
Comment 16 Fridolin Somers 2022-11-29 20:11:56 UTC
Any news?

Are the two new patches an alternative?
Comment 17 Filippos Kolovos 2022-12-02 10:48:48 UTC
Dear Fridolin, good morning,

As far as I am concerned, I have tested it in branches 20.x and 21.x and it works as expected.
Comment 18 Fridolin Somers 2023-08-08 20:56:56 UTC
Needs a rebase on current master
Comment 19 Fridolin Somers 2023-10-02 20:13:42 UTC
I suppose _do_xslt_proc() can be called just after SetUTF8Flag()
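
If so, the call site would look something like this (a sketch only, assuming _do_xslt_proc keeps the interface already used for bibliographic results; $marcrecord, $server and $xslt_handler are placeholders for whatever variables are in scope):

# Normalize the record's UTF-8 flag first, then apply the configured XSLT(s).
SetUTF8Flag($marcrecord);
( $marcrecord, $error ) = _do_xslt_proc( $marcrecord, $server, $xslt_handler );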
Comment 20 Fridolin Somers 2023-10-06 22:08:27 UTC
Arf, for UNIMARC there is a strange behavior; we had to patch it to always set the MARC flavour:
https://git.biblibre.com/biblibre/kohac/commit/cdb6e4cd1d56ae2904f7f722f21ba0f424a82190
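
The workaround is presumably along these lines (an assumption based on the commit title; the linked commit itself is not reproduced here):

use MARC::File::XML;
use C4::Context;

# Force the XML serializer's record format to the instance's flavour
# (e.g. UNIMARC) instead of relying on the module's default.
MARC::File::XML->default_record_format( C4::Context->preference('marcflavour') );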