Bug 33270

Summary: OAI-PMH should not die on record errors
Product: Koha Reporter: Nick Clemens (kidclamp) <nick>
Component: Architecture, internals, and plumbing Assignee: Nick Clemens (kidclamp) <nick>
Status: CLOSED FIXED QA Contact: Marcel de Rooy <m.de.rooy>
Severity: major    
Priority: P5 - low CC: dcook, lucas, m.de.rooy, tomascohen
Version: Main   
Hardware: All   
OS: All   
See Also: https://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=34014
Change sponsored?: --- Patch complexity: Small patch
Documentation contact: Documentation submission:
Text to go in the release notes:
Version(s) released in: 23.11.00, 23.05.02, 22.11.08
Circulation function:
Bug Depends on: 23846, 29697    
Bug Blocks:    
Attachments: Bug 33270: Warn errors but catch exceptions when encountering record issues in OAI-PMH
Bug 33270: Warn errors but catch exceptions when encountering record issues in OAI-PMH
Bug 33270: Add record_strip_nonxml routine to Koha::Biblio::Metadata
Bug 33270: Attempt to recover from invalid metadata exception
Bug 33270: Add record_strip_nonxml routine to Koha::Biblio::Metadata
Bug 33270: Attempt to recover from invalid metadata exception
Bug 33270: (follow-up) Handle records that fail attempt to ignore bad characters
Bug 33270: Add record_strip_nonxml routine to Koha::Biblio::Metadata
Bug 33270: Attempt to recover from invalid metadata exception
Bug 33270: (follow-up) Handle records that fail attempt to ignore bad characters
Bug 33270: (QA follow-up) Tidy
Bug 33270: (QA follow-up) Do not change param hashref

Description Nick Clemens (kidclamp) 2023-03-17 18:23:04 UTC
Currently, if the OAI ListRecords command hits a bad record, the script dies and there is no output.

We should wrap the code in an eval or try/catch and handle the error gracefully, allowing the rest of the catalogue to be harvested and reporting the problem in a way that it can be addressed.
Comment 1 Jonathan Druart 2023-05-11 10:47:14 UTC
*** Bug 33279 has been marked as a duplicate of this bug. ***
Comment 2 Nick Clemens (kidclamp) 2023-06-22 11:25:45 UTC
Created attachment 152550 [details] [review]
Bug 33270: Warn errors but catch exceptions when encountering record issues in OAI-PMH

This patch alters the OAI code to attempt to load a badly encoded record by removing non-XML characters. OAI harvests are used to keep external catalogs or display layers in sync with Koha. We should not break this connection when there is a bad record, but should allow the harvest to complete: warn the errors, and rely on Koha warnings/errors/tools to identify and fix the problem on the Koha side.

Test plan, assumes using KTD default data - otherwise you need to find and import a record with encoding issues:
1 - Enable OAI-PMH system preference
2 - Browse to:
    http://localhost:8080/cgi-bin/koha/oai.pl?verb=ListRecords&resumptionToken=marcxml/350////0/0/352
3 - 500 error:
    Invalid data, cannot decode metadata object (biblio_metadata.id=368, biblionumber=369, format=marcxml, schema=MARC21, decoding_error=':8: parser error : PCDATA invalid Char value 31...
4 - Apply patch, restart all
5 - Reload the page
6 - It loads!
7 - Click 'Metadata' for record 369 - it succeeds!
8 - Check the logs - confirm you see a warning of the record problem
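
In sketch form, the approach is roughly the following (not the actual patch; $metadata, $biblionumber and the surrounding OAI loop are assumed from existing Koha code):

    use Try::Tiny;
    use C4::Charset;
    use MARC::Record;
    use MARC::File::XML;

    # Instead of letting one decoding failure kill the whole ListRecords
    # response, fall back to a copy of the stored MARCXML with the illegal
    # characters removed, and warn so the record can be fixed in Koha.
    my $record = try {
        $metadata->record;
    } catch {
        warn "OAI-PMH: invalid metadata for biblionumber $biblionumber: $_";
        my $stripped = C4::Charset::StripNonXmlChars( $metadata->metadata );
        MARC::Record->new_from_xml( $stripped, 'UTF-8', $metadata->schema );
    };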
Comment 3 Tomás Cohen Arazi (tcohen) 2023-06-22 11:45:37 UTC
Does the OAI-PMH protocol prescribe a special behavior in these cases?
Comment 4 Sam Lau 2023-06-22 16:20:02 UTC
Created attachment 152573 [details] [review]
Bug 33270: Warn errors but catch exceptions when encountering record issues in OAI-PMH

This patch alters the OAI code to attempt to load a badly encoded record by removing non-XML characters. OAI harvests are used to keep external catalogs or display layers in sync with Koha. We should not break this connection when there is a bad record, but should allow the harvest to complete: warn the errors, and rely on Koha warnings/errors/tools to identify and fix the problem on the Koha side.

Test plan, assumes using KTD default data - otherwise you need to find and import a record with encoding issues:
1 - Enable OAI-PMH system preference
2 - Browse to:
    http://localhost:8080/cgi-bin/koha/oai.pl?verb=ListRecords&resumptionToken=marcxml/350////0/0/352
3 - 500 error:
    Invalid data, cannot decode metadata object (biblio_metadata.id=368, biblionumber=369, format=marcxml, schema=MARC21, decoding_error=':8: parser error : PCDATA invalid Char value 31...
4 - Apply patch, restart all
5 - Reload the page
6 - It loads!
7 - Click 'Metadata' for record 369 - it succeeds!
8 - Check the logs - confirm you see a warning of the record problem

Signed-off-by: Sam Lau <samalau@gmail.com>
Comment 5 Nick Clemens (kidclamp) 2023-06-22 16:31:06 UTC
(In reply to Tomás Cohen Arazi from comment #3)
> Does the OAI-PMH protocol prescribe a special behavior in this cases?

I suppose we should be sending an error element:
http://www.openarchives.org/OAI/2.0/guidelines-repository.htm#ErrorHandling

I consider this one a regression though - harvests used to ignore these, now they fail completely
Comment 6 David Cook 2023-06-22 23:42:31 UTC
I recently bumped into this problem as well, although I don't try to recover using StripNonXmlChars()... I don't know what I think about sending out a slightly different version than what's stored in the database...

But overall we do need a fix for harvests choking on one bad record.
Comment 7 Marcel de Rooy 2023-06-23 07:36:02 UTC
(In reply to David Cook from comment #6)
> I recently bumped into this problem as well, although I don't try to recover
> using StripNonXmlChars()... I don't know what I think about sending out a
> slightly different version than what's stored in the database...

Agreed.
Yes, the new code with StripNonXmlChars here in OAI is a bit weird. Either we do such a thing in ->record or we don't?
Probably we don't. So could you only send an error and continue?

Moving to FQA
Comment 8 Nick Clemens (kidclamp) 2023-06-23 16:50:32 UTC
Created attachment 152649 [details] [review]
Bug 33270: Add record_strip_nonxml routine to Koha::Biblio::Metadata

This adds a routine that can strip non-XML characters from a record.
It is intended for cases where we do not wish to throw an exception,
but rather need to process a record to allow other work to continue.

To test:
prove -v t/db_dependent/Koha/Biblio/Metadata.t
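
In rough form, the routine might look something like this (a sketch only, assuming C4::Charset::StripNonXmlChars and the metadata/schema accessors on Koha::Biblio::Metadata; the real implementation is in the attachment):

    use C4::Charset;
    use MARC::Record;
    use MARC::File::XML;

    sub record_strip_nonxml {
        my ($self) = @_;

        # Drop characters that are not legal in XML 1.0, then parse the
        # result, so callers that cannot tolerate an exception still get
        # a usable MARC::Record.
        my $xml = C4::Charset::StripNonXmlChars( $self->metadata );

        return MARC::Record->new_from_xml( $xml, 'UTF-8', $self->schema );
    }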
Comment 9 Nick Clemens (kidclamp) 2023-06-23 16:50:35 UTC
Created attachment 152650 [details] [review]
Bug 33270: Attempt to recover from invalid metadata exception

This patch uses the new record_strip_nonxml routine to attempt to display
the record when it contains invalid characters

Rather than silently stripping these, we warn in the logs, then add an 'about'
container to the response. It is displayed nicely in the web view or sent as
"INVALID_METADATA" in the XML response.

The 'error' codes for OAI seem to be at the request level, and the offered codes don't have a match
for a bad record. Adding the about when we can recover seems the most generous response

To test:
Test plan, assumes using KTD default data - otherwise you need to find and import a record with encoding issues:
 1 - Enable OAI-PMH system preference
 2 - Browse to:
    http://localhost:8080/cgi-bin/koha/oai.pl?verb=ListRecords&resumptionToken=marcxml/350////0/0/352
 3 - 500 error:
    Invalid data, cannot decode metadata object (biblio_metadata.id=368, biblionumber=369, format=marcxml, schema=MARC21, decoding_error=':8: parser error : PCDATA invalid Char value 31...
 4 - Apply patch, restart all
 5 - Reload the page
 6 - It loads!
 7 - Click 'Metadata' for record 369 - it succeeds!
 8 - Check the logs - confirm you see a warning of the record problem
 9 - Confirm 369 has an about section
10 - Check the individual 'GetRecord' response as well
    http://localhost:8080/cgi-bin/koha/oai.pl?verb=GetRecord&metadataPrefix=oai_dc&identifier=KOHA-OAI-TEST:369
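
The recovery path, in sketch form (illustrative only; the exception class and accessors are taken from the error above and existing Koha conventions):

    use Try::Tiny;

    my ( $marcxml, $about );
    try {
        $marcxml = $metadata->record->as_xml_record;
    } catch {
        if ( ref $_ && $_->isa('Koha::Exceptions::Metadata::Invalid') ) {
            # Log the problem so it can be fixed on the Koha side, then
            # fall back to the stripped record and flag it in 'about'.
            warn "Invalid metadata for biblionumber $biblionumber: $_";
            $marcxml = $metadata->record_strip_nonxml->as_xml_record;
            $about   = 'INVALID_METADATA';
        } else {
            die $_;    # anything else is still fatal
        }
    };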
Comment 10 Nick Clemens (kidclamp) 2023-06-23 17:00:34 UTC
(In reply to Marcel de Rooy from comment #7)
> (In reply to David Cook from comment #6)
> > I recently bumped into this problem as well, although I don't try to recover
> > using StripNonXmlChars()... I don't know what I think about sending out a
> > slightly different version than what's stored in the database...
> 
> Agreed.
> Yes, the new code with StripNonXmlChars here in OAI is a bit weird. Either
> we do such a thing in ->record or we dont?
> Probably we dont. So could you only send an error and continue?
> 
> Moving to FQA

We used to do this, see bug 29697

OAI doesn't seem to have errors at the record level, only at the request level, and they don't match what is happening here:
https://www.openarchives.org/OAI/openarchivesprotocol.html#ErrorConditions

(In reply to David Cook from comment #6)
> I recently bumped into this problem as well, although I don't try to recover
> using StripNonXmlChars()... I don't know what I think about sending out a
> slightly different version than what's stored in the database...
> 
> But overall we do need a fix harvests choking on 1 bad record.

It is a bit odd; what we do on the staff side is just show it and indicate it may be degraded - I propose something similar here: strip them and add an 'about' section. This seems to be a more general way to send a message.


We should ultimately prevent these records from being added to a catalog, but data can get messy. I restrict this now to a specific exception. I think it's a better user experience to handle it and warn when we can, rather than reject it entirely.
Comment 11 Sam Lau 2023-06-28 18:09:19 UTC
Created attachment 152836 [details] [review]
Bug 33270: Add record_strip_nonxml routine to Koha::Biblio::Metadata

This adds a routine that can strip non-XML characters from a record.
It is intended for cases where we do not wish to throw an exception,
but rather need to process a record to allow other work to continue.

To test:
prove -v t/db_dependent/Koha/Biblio/Metadata.t

Signed-off-by: Sam Lau <samalau@gmail.com>
Comment 12 Sam Lau 2023-06-28 18:09:21 UTC
Created attachment 152837 [details] [review]
Bug 33270: Attempt to recover from invalid metadata exception

This patch uses the new record_strip_nonxml routine to attempt to display
the record when it contains invalid characters

Rather than silently stripping these, we warn in the logs, then add an 'about'
container to the response. It is displayed nicely in the web view or sent as
"INVALID_METADATA" in the XML response.

The 'error' codes for OAI seem to be at the request level, and the offered codes don't have a match
for a bad record. Adding the about when we can recover seems the most generous response

To test:
Test plan, assumes using KTD default data - otherwise you need to find and import a record with encoding issues:
 1 - Enable OAI-PMH system preference
 2 - Browse to:
    http://localhost:8080/cgi-bin/koha/oai.pl?verb=ListRecords&resumptionToken=marcxml/350////0/0/352
 3 - 500 error:
    Invalid data, cannot decode metadata object (biblio_metadata.id=368, biblionumber=369, format=marcxml, schema=MARC21, decoding_error=':8: parser error : PCDATA invalid Char value 31...
 4 - Apply patch, restart all
 5 - Reload the page
 6 - It loads!
 7 - Click 'Metadata' for record 369 - it succeeds!
 8 - Check the logs - confirm you see a warning of the record problem
 9 - Confirm 369 has an about section
10 - Check the individual 'GetRecord' response as well
    http://localhost:8080/cgi-bin/koha/oai.pl?verb=GetRecord&metadataPrefix=oai_dc&identifier=KOHA-OAI-TEST:369

Signed-off-by: Sam Lau <samalau@gmail.com>
Comment 13 Marcel de Rooy 2023-06-30 09:29:53 UTC
Revisiting.
This code part somehow does not look good to me in OAI::Server::ListBase:

                if ($marcxml) {
                    $self->record(
                        Koha::OAI::Server::Record->new(
                            $repository, $marcxml, $timestamp, \@setSpecs,
                            %params
                        )
                    );
                } else {
                    $self->record(
                        Koha::OAI::Server::DeletedRecord->new(
                            $timestamp, \@setSpecs, identifier => $repository->{koha_identifier} . ':' . $biblionumber
                        )
                    );

If there is no marcxml, you now decide that it is a DeletedRecord. But I would assume that you should just test $deleted? It could be a normal record that also crashes on the strip_nonxml route.
Please clarify.
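
Roughly what I mean, as a sketch only (reusing the variable names from the quoted snippet):

    if ($deleted) {
        $self->record(
            Koha::OAI::Server::DeletedRecord->new(
                $timestamp, \@setSpecs,
                identifier => $repository->{koha_identifier} . ':' . $biblionumber
            )
        );
    } elsif ($marcxml) {
        $self->record(
            Koha::OAI::Server::Record->new(
                $repository, $marcxml, $timestamp, \@setSpecs, %params
            )
        );
    } else {
        # Not deleted, but the record could not be recovered either:
        # warn and skip it rather than misreporting it as deleted.
        warn "Skipping biblionumber $biblionumber: metadata could not be recovered";
        next;
    }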

Do we need additional tests in the db_dependent OAI scripts btw?
Comment 14 Nick Clemens (kidclamp) 2023-07-11 14:00:18 UTC
Created attachment 153325 [details] [review]
Bug 33270: (follow-up) Handle records that fail attempt to ignore bad characters
Comment 15 Marcel de Rooy 2023-07-17 13:11:48 UTC
Created attachment 153542 [details] [review]
Bug 33270: Add record_strip_nonxml routine to Koha::Biblio::Metadata

This adds a routine that can strip non-XML characters from a record.
It is intended for cases where we do not wish to throw an exception,
but rather need to process a record to allow other work to continue.

To test:
prove -v t/db_dependent/Koha/Biblio/Metadata.t

Signed-off-by: Sam Lau <samalau@gmail.com>

Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Comment 16 Marcel de Rooy 2023-07-17 13:11:50 UTC
Created attachment 153543 [details] [review]
Bug 33270: Attempt to recover from invalid metadata exception

This patch uses the new record_strip_nonxml routine to attempt to display
the record when it contains invalid characters

Rather than silently stripping these, we warn in the logs, then add an 'about'
container to the response. It is displayed nicely in the web view or sent as
"INVALID_METADATA" in the XML response.

The 'error' codes for OAI seem to be at the request level, and the offered codes don't have a match
for a bad record. Adding the about when we can recover seems the most generous response

To test:
Test plan, assumes using KTD default data - otherwise you need to find and import a record with encoding issues:
 1 - Enable OAI-PMH system preference
 2 - Browse to:
    http://localhost:8080/cgi-bin/koha/oai.pl?verb=ListRecords&resumptionToken=marcxml/350////0/0/352
 3 - 500 error:
    Invalid data, cannot decode metadata object (biblio_metadata.id=368, biblionumber=369, format=marcxml, schema=MARC21, decoding_error=':8: parser error : PCDATA invalid Char value 31...
 4 - Apply patch, restart all
 5 - Reload the page
 6 - It loads!
 7 - Click 'Metadata' for record 369 - it succeeds!
 8 - Check the logs - confirm you see a warning of the record problem
 9 - Confirm 369 has an about section
10 - Check the individual 'GetRecord' response as well
    http://localhost:8080/cgi-bin/koha/oai.pl?verb=GetRecord&metadataPrefix=oai_dc&identifier=KOHA-OAI-TEST:369

Signed-off-by: Sam Lau <samalau@gmail.com>

Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Comment 17 Marcel de Rooy 2023-07-17 13:11:53 UTC
Created attachment 153544 [details] [review]
Bug 33270: (follow-up) Handle records that fail attempt to ignore bad characters

Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Comment 18 Marcel de Rooy 2023-07-17 13:11:55 UTC
Created attachment 153545 [details] [review]
Bug 33270: (QA follow-up) Tidy

Resolve:
WARN   tidiness
  The file is less tidy than before (bad/messy lines before: 21, now: 24)

Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Comment 19 Marcel de Rooy 2023-07-17 13:11:58 UTC
Created attachment 153546 [details] [review]
Bug 33270: (QA follow-up) Do not change param hashref

Might just be a theoretical thing now, but safer to clone.

Test plan:
Run t/db_dependent/Koha/Biblio/Metadata.t

Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
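
The pattern, as a hypothetical sketch (names made up; the point is only that the incoming hashref gets cloned before being changed):

    sub some_routine {    # hypothetical name
        my ( $self, $params ) = @_;

        # Shallow-clone the incoming hashref so the caller's data
        # structure is not modified as a side effect.
        $params = { %{ $params // {} } };
        $params->{embed_items} //= 0;    # only the copy is touched

        # ... rest of the routine works on the cloned $params ...
    }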
Comment 20 Marcel de Rooy 2023-07-17 13:13:37 UTC
(In reply to Marcel de Rooy from comment #13)
> Do we need additional tests in the db_dependent OAI scripts btw?

Perhaps we do. Final word for RM.

prove t/db_dependent/OAI/
t/db_dependent/OAI/AndSets.t .. ok
t/db_dependent/OAI/Server.t ... ok
t/db_dependent/OAI/Sets.t ..... ok
All tests successful.
Files=3, Tests=191,  9 wallclock secs ( 0.07 usr  0.03 sys +  6.86 cusr  0.74 csys =  7.70 CPU)
Result: PASS
Comment 21 Tomás Cohen Arazi (tcohen) 2023-07-18 15:47:48 UTC
Pushed to master for 23.11.

Nice work everyone, thanks!
Comment 22 Martin Renvoize (ashimema) 2023-07-19 08:30:39 UTC
Thanks for all the hard work!

Pushed to 23.05.x for the next release
Comment 23 Matt Blenkinsop 2023-07-19 10:15:28 UTC
Nice work everyone!

Pushed to oldstable for 22.11.x