| Summary: | Some oversized records with UTF-8 characters cause import worker to die | | |
|---|---|---|---|
| Product: | Koha | Reporter: | Janusz Kaczmarek <januszop> |
| Component: | MARC Bibliographic record staging/import | Assignee: | Bugs List <koha-bugs> |
| Status: | NEW | QA Contact: | Testopia <testopia> |
| Severity: | major | | |
| Priority: | P5 - low | CC: | dcook, martin.renvoize |
| Version: | unspecified | | |
| Hardware: | All | | |
| OS: | All | | |
| See Also: | https://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=38913, https://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=35104 | | |
| GIT URL: | | Change sponsored?: | --- |
| Patch complexity: | --- | Documentation contact: | |
| Documentation submission: | | Text to go in the release notes: | |
| Version(s) released in: | | Circulation function: | |
| Attachments: | Test record | | |
Description
Janusz Kaczmarek 2025-01-20 22:16:12 UTC
Created attachment 176845 [details]
Test record
A test MARCXML record to confirm the issue.
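For background (an editor's note based on standard MARC behaviour, not something stated in this report): the binary MARC serialization (ISO 2709) stores the total record length as a five-digit number in leader positions 00-04, so a record longer than 99999 bytes cannot be represented faithfully, and multibyte UTF-8 characters inflate the byte count toward that limit. A minimal consistency check on a raw blob might look like this (the helper name is illustrative):

```perl
use Modern::Perl;

# Returns true when the length declared in the leader matches the
# actual byte length of the blob. Oversized records typically fail
# this check. Assumes $blob holds raw ISO 2709 bytes; the sub name
# is hypothetical, not from Koha.
sub leader_length_consistent {
    my ($blob) = @_;
    my $declared = substr $blob, 0, 5;    # leader/00-04: record length
    return $declared =~ /^\d{5}$/ && $declared == length $blob;
}
```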
David Cook (comment #2):

I was actually thinking about this a bit yesterday when fixing bug 38913. There are other places that call "new_from_usmarc". Some of them, like the API, have try/catch around them, I think, but I'm sure not all the calls do.

Reply (in reply to David Cook from comment #2):

This should be investigated from that angle. But the current case can be solved with the MARCXML that we have anyway. There is no need for a try around new_from_usmarc here if we can effectively import such records with new_from_xml, IMO.
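A minimal sketch of what the two suggestions above could look like together. This is not Koha's actual import code: the helper name parse_staged_record and the fallback policy are illustrative assumptions; only MARC::Record->new_from_usmarc, MARC::Record->new_from_xml (from MARC::File::XML), and Try::Tiny are real APIs.

```perl
use Modern::Perl;
use Try::Tiny;
use MARC::Record;
use MARC::File::XML ( BinaryEncoding => 'utf8', RecordFormat => 'MARC21' );

# Parse one staged record without letting a malformed blob kill the
# worker: prefer the MARCXML copy when we have one (the reply's
# suggestion), and guard the binary parser with try/catch so a die()
# is survivable (comment #2's suggestion).
sub parse_staged_record {
    my ( $usmarc_blob, $marcxml ) = @_;

    # Prefer MARCXML: it has no ISO 2709 record-length limit.
    if ( defined $marcxml ) {
        my $record = try {
            MARC::Record->new_from_xml( $marcxml, 'UTF-8', 'MARC21' );
        }
        catch {
            warn "new_from_xml failed: $_";
            undef;
        };
        return $record if $record;
    }

    # Fall back to the binary blob, guarded against a die().
    return try {
        MARC::Record->new_from_usmarc($usmarc_blob);
    }
    catch {
        warn "new_from_usmarc failed: $_";
        undef;    # caller logs and skips this record instead of dying
    };
}
```

Preferring new_from_xml sidesteps the ISO 2709 length limit entirely, while the try/catch keeps one bad record from taking down the whole import run.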