Summary: | Some oversized records with UTF-8 characters cause import worker to die | |
---|---|---|---
Product: | Koha | Reporter: | Janusz Kaczmarek <januszop>
Component: | MARC Bibliographic record staging/import | Assignee: | Bugs List <koha-bugs>
Status: | NEW --- | QA Contact: | Testopia <testopia>
Severity: | major | |
Priority: | P5 - low | CC: | dcook, martin.renvoize
Version: | unspecified | |
Hardware: | All | |
OS: | All | |
See Also: | https://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=38913, https://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=35104, https://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=40326 | |
GIT URL: | | Change sponsored?: | ---
Patch complexity: | --- | Documentation contact: |
Documentation submission: | | Text to go in the release notes: |
Version(s) released in: | | Circulation function: |
Attachments: | Test record | |
Description
Janusz Kaczmarek 2025-01-20 22:16:12 UTC
Created attachment 176845 [details]
Test record
A test MARCXML record to confirm the issue.
David Cook (comment #2):

I was actually thinking about this a bit yesterday when fixing bug 38913. There are other places that call "new_from_usmarc". Some of them - like the API - have try/catch around them, I think, but I'm sure not all the calls do.

Follow-up comment:

(In reply to David Cook from comment #2)
> I was actually thinking about this a bit yesterday when fixing bug 38913.
> There are other places that call "new_from_usmarc". Some of them - like the
> API - have try/catch around them I think, but I'm sure not all the calls do.

This should be investigated from that angle. But the current case can be solved with the MARCXML that we already have. There is no need for a try around new_from_usmarc here if we can effectively import such records with new_from_xml, IMO.

Follow-up comment:

Bumped into this one with ./misc/migration_tools/bulkmarcimport.pl. In our case I think it's bad leader data, but $record->as_usmarc is what is killing us. In C4::Biblio::ModBiblioMarc there is a call to $record->as_usmarc to re-calculate the record length, and that was throwing a fatal error, so I've wrapped that one in an eval{}...

It looks like bug 38913 will still fall victim to this one too, because $record->as_usmarc is used there as well. I feel like some people reported issues even after bug 38913 with Elasticsearch indexing, and maybe that's why...
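A minimal sketch of what the first follow-up comment suggests (not the actual Koha code): parse the record from the MARCXML we already have instead of the ISO 2709 blob, since MARCXML carries no 5-digit record-length field and so does not hit the oversize limit. $marcxml is a placeholder for wherever the import code obtains the XML; Try::Tiny is assumed for the error handling.

```perl
use MARC::Record;
use MARC::File::XML ( BinaryEncoding => 'utf8', RecordFormat => 'MARC21' );
use Try::Tiny;

# $marcxml is a placeholder for the MARCXML blob the import already has.
my $record = try {
    # MARCXML has no ISO 2709 leader length field, so an oversized
    # record with multi-byte UTF-8 characters parses without dying.
    MARC::Record->new_from_xml( $marcxml, 'UTF-8', 'MARC21' );
}
catch {
    warn "Could not parse MARCXML: $_";
    undef;
};
```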
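And a minimal sketch, assuming a plain eval{} as described in the last comment, of the defensive wrapper around $record->as_usmarc (the actual change in C4::Biblio::ModBiblioMarc may differ):

```perl
# as_usmarc() serializes to ISO 2709 and can die on malformed or
# oversized records (the leader length field only holds 5 digits);
# trap the error instead of letting it kill the import worker.
my $usmarc = eval { $record->as_usmarc() };
if ( $@ or !defined $usmarc ) {
    warn "as_usmarc failed; skipping record length recalculation: $@";
    # fall through without the ISO 2709 serialization
}
```

The same guard would be needed anywhere else as_usmarc is called on untrusted records, e.g. the bug 38913 code path and the Elasticsearch indexing mentioned above.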